VOOKI INFOSEC

Penetration Testing

Steps to perform Penetration Testing

Initial Steps

The initial steps to start a scan are as follows:

  • Choose Penetration Testing from the left navigation bar.
  • Enter the URL that you want to scan.
  • Select your preferred browser from the list and click Launch.
  • You can also choose "Any Browser" (Manual Configuration) to connect a browser of your choice manually. Choosing this option will not open a browser; it will only open the proxy port (see the proxy sketch after this list).
  • After the browser is launched, crawl all the pages of the website and interact with all the inputs on the pages.
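
If you choose "Any Browser" (Manual Configuration), any HTTP client pointed at the opened proxy port is captured in the same way as a launched browser. Below is a minimal sketch using Python's requests library; the address 127.0.0.1:8080 is an assumption, so use the port the tool actually displays.

    # A minimal sketch, assuming the proxy listens on 127.0.0.1:8080
    # (use the port shown when you choose manual configuration).
    import requests

    proxies = {
        "http": "http://127.0.0.1:8080",
        "https": "http://127.0.0.1:8080",
    }

    # Traffic routed through the proxy is visible to the scanner,
    # just like traffic from a launched browser.
    response = requests.get("https://example.com/login",
                            proxies=proxies,
                            verify=False)  # the intercepting proxy re-signs TLS
    print(response.status_code)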

Intercepting

To intercept a request, follow these steps (a conceptual sketch of interception follows the list):

  • Go to the Intercept tab and turn on interception.
  • You will now start intercepting requests.
  • Here you can edit the request; to send it to the server, click Send to Server.
  • To drop the request, click Drop.
  • To capture the response and edit it before it is displayed on the front end, click Break on Response.
  • To use the request later for testing, send it to Compose by clicking Send to Compose.
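
The interception flow (edit, forward, drop, or break on response) can be pictured in code. The following is a conceptual sketch written as a mitmproxy addon purely for illustration; it is not Vooki's implementation, which performs these steps for you through the GUI.

    # Conceptual sketch of intercept-and-edit, using mitmproxy for illustration.
    # Run with: mitmproxy -s intercept_sketch.py
    from mitmproxy import http

    def request(flow: http.HTTPFlow) -> None:
        # Editing the request before it reaches the server is comparable to
        # editing in the intercept tab and clicking Send to Server.
        flow.request.headers["X-Edited"] = "true"

        # Killing the flow is the equivalent of clicking Drop.
        if "blocked-path" in flow.request.path:
            flow.kill()

    def response(flow: http.HTTPFlow) -> None:
        # Editing the response before the browser renders it is the
        # equivalent of Break on Response.
        flow.response.headers["X-Inspected"] = "true"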

Compose

To use the Compose feature, follow these steps:

  • You can add a request to Compose by intercepting it and clicking Send to Compose.
  • You can also add a request to Compose from the history table by right-clicking the request and selecting Send to Compose.
  • Click the "Compose" tab to view all the available requests.
  • From here, you can edit the request and use it as per your needs (see the replay sketch below).
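
Composing boils down to replaying a captured request with edited values. A rough sketch of that idea is shown below; the URL, parameter names, and values are assumptions for illustration only.

    # Replay a captured request with edited parameters.
    import requests

    captured = {
        "method": "POST",
        "url": "https://example.com/api/login",               # assumed URL
        "headers": {"Content-Type": "application/x-www-form-urlencoded"},
        "data": {"username": "alice", "password": "secret"},  # assumed fields
    }

    # Edit the request before resending it.
    captured["data"]["username"] = "bob"

    response = requests.request(captured["method"], captured["url"],
                                headers=captured["headers"],
                                data=captured["data"])
    print(response.status_code, len(response.text))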

Scanning

To start the scan, follow these steps:

  • After the crawling of pages is done, right-click on the left tree node (your host) and click Scan.
  • When you click Scan, you will get four pop-ups; provide input in all of them:
    • Scan Configuration
    • Crawler
    • Authentication
    • CSRF Token Generation

    Scan Configuration: Concurrent Requests lets you set the number of parallel requests to send, Web Crawler Timeout lets you set the timeout for the crawling requests, and Scan Request Timeout lets you set the timeout for the scan requests. A short sketch of what these settings control follows.
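
    To make the settings concrete, the sketch below shows what concurrent requests and per-request timeouts control. The values, URLs, and the helper probe() are assumptions for illustration, not part of the tool.

        # Illustration of Concurrent Requests and Scan Request Timeout.
        import concurrent.futures
        import requests

        CONCURRENT_REQUESTS = 5      # number of parallel requests (assumed value)
        SCAN_REQUEST_TIMEOUT = 30    # per-request timeout in seconds (assumed value)

        urls = [f"https://example.com/page{i}" for i in range(20)]

        def probe(url):
            return url, requests.get(url, timeout=SCAN_REQUEST_TIMEOUT).status_code

        with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENT_REQUESTS) as pool:
            for url, status in pool.map(probe, urls):
                print(status, url)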

    Crawler: If you click Yes, it will start crawling the website. Our crawling mechanism performs an in-depth scan of your website and identifies the web pages it exposes. A minimal crawler sketch follows.
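
    The sketch below illustrates the crawling idea: fetch a page, collect same-site links, and repeat. It is a simplified stand-in, not Vooki's crawler; the start URL and limits are assumptions.

        # Minimal same-site crawler sketch.
        import urllib.parse
        from html.parser import HTMLParser

        import requests

        class LinkParser(HTMLParser):
            def __init__(self):
                super().__init__()
                self.links = []

            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.links.append(value)

        def crawl(start_url, timeout=10, limit=50):
            seen, queue = set(), [start_url]
            host = urllib.parse.urlparse(start_url).netloc
            while queue and len(seen) < limit:
                url = queue.pop()
                if url in seen:
                    continue
                seen.add(url)
                parser = LinkParser()
                parser.feed(requests.get(url, timeout=timeout).text)
                for link in parser.links:
                    absolute = urllib.parse.urljoin(url, link)
                    if urllib.parse.urlparse(absolute).netloc == host:
                        queue.append(absolute)
            return seen

        for page in crawl("https://example.com/"):
            print(page)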

    Authentication: Several authentication modes are available (a sketch of the two simplest follows the list):

    • Fetch a session cookie from a proxy.
    • Manually enter session cookie.
    • Simple form authentication.
    • Complex authentication.
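
    The sketch below shows the two simplest modes in code: manually supplying a session cookie, and simple form authentication. The cookie name, login URL, and field names are assumptions; substitute the values your application uses.

        import requests

        session = requests.Session()

        # Manually enter a session cookie (copied from an authenticated browser).
        session.cookies.set("SESSIONID", "paste-your-cookie-value-here")

        # Simple form authentication: post credentials so the session picks up
        # the authenticated cookie automatically.
        session.post("https://example.com/login",
                     data={"username": "alice", "password": "secret"})

        # Subsequent scan requests reuse the authenticated session.
        print(session.get("https://example.com/account").status_code)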

    CSRF Token Generation: This module is used to get past the CSRF token check. If the website you are scanning uses CSRF tokens, provide the token key and value, click Check & Save, and then click Scan. If a CSRF check is not applicable, click Skip & Scan. A sketch of CSRF token handling follows.
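
    The sketch below shows typical CSRF token handling: fetch the form, read the hidden token, and send it back with the request. The field name "csrf_token" and the URLs are assumptions; use the key and value your application expects.

        import re
        import requests

        session = requests.Session()
        page = session.get("https://example.com/form")

        # Pull the token out of the hidden form field (assumed field name).
        match = re.search(r'name="csrf_token" value="([^"]+)"', page.text)
        token = match.group(1) if match else ""

        response = session.post("https://example.com/form",
                                data={"csrf_token": token, "comment": "test"})
        print(response.status_code)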

  • After the scan is completed, you can generate the report and save the data externally.

Report Generation

To generate the report, follow these steps:

  • Click Generate Report and select the report type you want.
  • This will generate the report based on your selection.
  • Save the file in your preferred location.

Saving scan externally

There are two options to save the scanned data externally. To save the scan data, follow these steps:

  • After the scan completes, a notification to save the data appears. Click OK, choose the preferred location, and save the data.
  • To save the scan result later, click Save Scan Result and choose the preferred location.