Techniques for Gathering Information in Ethical Hacking
Foot-printing is a way of analysing a target site to obtain information. With this technique we can obtain a large amount of information on the target, as remotely as possible and without leaving any traces, and prepare ourselves for the actual work ahead.
The most common approach is known as passive foot-printing. Passive foot-printing is a way of obtaining information about a target without actually touching the target; in other words, we remain anonymous.
There are two main categories of foot-printing, and the techniques discussed below fall into them:
1. DNS foot-printing
2. Network foot-printing
There are multiple techniques involved in passive foot-printing. Some of them are discussed below.
- Crawling
Crawling is a way in which we use tools to crawl a target site and identify all paths, folder structures, pages and any hidden information within it. One such tool that we can use for crawling is BlackWidow.
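To illustrate the core of what a crawler does, here is a minimal sketch in Python using only the standard library: it extracts every link from a page so those links can then be visited in turn. The sample HTML and the example.com URLs are placeholder assumptions, not taken from any real target; a full crawler would fetch each discovered link recursively.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects every href found in anchor tags on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative paths against the page's base URL
                    self.links.append(urljoin(self.base_url, value))

# Sample page standing in for a fetched response (no real site is contacted)
sample_html = """
<html><body>
  <a href="/login">Login</a>
  <a href="docs/manual.pdf">Manual</a>
  <a href="https://example.com/admin">Admin</a>
</body></html>
"""

parser = LinkExtractor("https://example.com/")
parser.feed(sample_html)
print(parser.links)
```

A crawler like BlackWidow repeats this extraction for every page it discovers, building up the site's full path and folder structure.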
Note: Well-behaved crawlers will honour any restrictions that we specify within the robots.txt file placed in the web root. However, a tool like BlackWidow has the ability to ignore this file's content and crawl all the required paths.
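As a sketch of how robots.txt restrictions are interpreted, Python's standard library ships a parser for exactly this file format. The rules and URLs below are assumed examples for illustration, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt body as a site might serve it; parsed locally for illustration
robots_txt = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /backup/",
]

rp = RobotFileParser()
rp.parse(robots_txt)

# A well-behaved crawler makes this check before fetching each path;
# an aggressive crawler simply skips the check
print(rp.can_fetch("*", "https://example.com/admin/panel"))  # disallowed
print(rp.can_fetch("*", "https://example.com/index.html"))   # allowed
```

Note that robots.txt is advisory only: the "Disallow" entries themselves can reveal exactly the sensitive paths a site wants hidden.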
- Whois
This is another way of obtaining required information about a target site without actually visiting it. This technique falls under the category of DNS foot-printing. With it we can identify how long a company has been around through its WHOIS registration.
All we have to do is access whois.domaintools.com and specify the domain that we need information about. This site provides us with information such as the name servers, dates of registration and the point of contact for registrations.
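Behind sites like domaintools, WHOIS is itself a very simple text protocol over TCP port 43 (RFC 3912): send the domain followed by CRLF and read the reply. The sketch below shows this directly in Python; whois.iana.org is used as an illustrative default server, and the live lookup is left commented out.

```python
import socket

def build_query(domain):
    # RFC 3912: the query is the domain name followed by CRLF
    return domain.encode("ascii") + b"\r\n"

def whois(domain, server="whois.iana.org", port=43):
    """Send a WHOIS query over plain TCP and return the raw response text."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall(build_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

print(build_query("example.com"))
# print(whois("example.com"))  # live network call; uncomment to run
```

The response is free-form text containing the registration dates, name servers and contact details discussed above.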
- MX Entries
We may need to obtain information regarding mail servers in our foot-printing activities. Looking at MX (Mail Exchange) entries is a technique whereby we obtain information about a target's mail servers.
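An MX lookup is just a DNS query with QTYPE 15. As a sketch of what tools like dig or nslookup do under the hood, the following builds a raw DNS query packet for MX records using only the Python standard library; sending it to a resolver such as 8.8.8.8 is shown only as a comment, and the transaction ID is an arbitrary example value.

```python
import struct

def build_mx_query(domain, txn_id=0x1234):
    """Build a raw DNS query packet asking for MX records (QTYPE 15)."""
    # Header: id, flags (recursion desired), 1 question, 0 other records
    header = struct.pack(">HHHHHH", txn_id, 0x0100, 1, 0, 0, 0)
    # Question name: length-prefixed labels terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in domain.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 15, 1)  # QTYPE=MX, QCLASS=IN
    return header + question

packet = build_mx_query("example.com")
print(packet.hex())
# To resolve for real, this packet would be sent over UDP port 53, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(packet, ("8.8.8.8", 53))
```

In practice a command such as "dig MX example.com" does all of this, plus parsing the answer, for us.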
- Traceroute
Traceroute helps us obtain information about the network path between us and a target system. On Ubuntu we can use the command-line tool traceroute to obtain this information.
ex: sudo traceroute www.google.com > tracecert.txt
    cat tracecert.txt
traceroute to www.google.com (173.194.127.178), 30 hops max, 60 byte packets
 1  10.100.0.254 (10.100.0.254)  4.727 ms  5.181 ms  5.653 ms
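Under the hood, traceroute sends probes with an increasing time-to-live (TTL): each router that decrements the TTL to zero replies with an ICMP "time exceeded" message, revealing one hop of the path. The sketch below only demonstrates setting the TTL on an ordinary UDP socket; actually receiving the ICMP replies requires a raw socket and root privileges, which is why we normally just use the traceroute tool itself.

```python
import socket

observed = []
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for ttl in range(1, 4):
    # Each probe leaves with a TTL one higher than the last, so it expires
    # exactly one router further along the path
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, ttl)
    observed.append(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TTL))
    # sock.sendto(b"", ("www.google.com", 33434))  # real probe; reading the
    # ICMP "time exceeded" reply needs a raw socket and root privileges
sock.close()
print(observed)
```

Each hop line in the traceroute output above corresponds to one such TTL value and the router that answered for it.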
- Archive pages
This is a way of identifying old data that a target site may have forgotten about but which may be useful for us in our planned activities. When a company initially starts, it may not place much emphasis on protecting certain sensitive information and may have left it publicly accessible on its website. With sites such as archive.org we can access archived pages of companies and obtain information they exposed at that time.
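archive.org also exposes a Wayback Machine availability API that can be queried programmatically for the snapshot closest to a given date. The sketch below only builds the request URL; the live lookup is left as a comment since it needs network access, and example.com with the year 2005 are placeholder values.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen  # only needed for the live call

def availability_url(page, timestamp=None):
    """Build a request URL for archive.org's Wayback availability API."""
    params = {"url": page}
    if timestamp:
        # e.g. "2005" asks for the snapshot closest to that year
        params["timestamp"] = timestamp
    return "https://archive.org/wayback/available?" + urlencode(params)

url = availability_url("example.com", "2005")
print(url)
# Live lookup (network call):
#   snapshot = json.load(urlopen(url))["archived_snapshots"]
```

The JSON response points at the closest archived copy, which can then be crawled like any other page.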
- Social Networking sites
Sometimes we tend to publish information related to events at the workplace in chat groups or on social networking sites such as Facebook. Some of our friends have published information such as "I was struggling to configure the proxy server the whole day and still couldn't get anything done". Such information may tell us that by the end of that day the proxy was down, and we can try to see whether we can bypass the proxy and do whatever we want on the company's intranet.
- Negative sites
Negative sites are those sites that publish information defaming another site; it could even be a chat forum that discusses bad sales experiences with a specific website.
Within these sites people may publish information regarding items purchased, issues they faced, amounts paid and other seemingly innocuous details that will help our foot-printing tasks and our goal of identifying the required information about a target.