Extract Links from Page

This tool parses the HTML of a web page and extracts the links it contains. The hrefs, or "page links", are displayed in plain text for easy copying or review.

Find out what a page links to with this information gathering tool; both internal and external links are displayed. When security testing an organization or website, forgotten and poorly maintained web applications can be a great place to find weak spots. Dumping the page links is a quick way to find other linked applications, web technologies and related websites.

About the Page Links Scraping Tool

The purpose of this tool is to allow fast and easy scraping of links from a web page. Listing the links, domains and resources that a page points to can tell you a lot about the page. Reasons for using a tool such as this are wide ranging: from Internet research and web page development to security assessments and web page testing.

The tool has been built around Lynx, a simple and well-known command line tool. Lynx is a text-based web browser popular on Linux-based operating systems.

Lynx can also be used for troubleshooting and testing web pages from the command line. Being a text-based browser, it cannot display graphics, but it is a handy tool for reading text-based pages. It was first developed around 1992 and is capable of using old-school Internet protocols, including Gopher and WAIS, along with the more commonly known HTTP, HTTPS, FTP and NNTP.

Select Web Page to Scrape

Enter a valid URL into the form and that page will be downloaded by our system. The HTML will then be analyzed and URLs will be extracted from the results. This technique is known as scraping.
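The extraction step can be sketched in Python using the standard library's HTML parser. This is an illustrative sketch, not the tool's actual implementation; the class name and sample HTML are ours.

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag encountered."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


# In practice the HTML would be the downloaded page body;
# a small inline sample keeps the sketch self-contained.
sample_html = """
<html><body>
  <a href="https://example.com/about">About</a>
  <a href="/contact">Contact</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(sample_html)
print(parser.links)  # → ['https://example.com/about', '/contact']
```

Both absolute and relative hrefs are collected; a real scraper would typically resolve relative links against the page URL with urllib.parse.urljoin.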

Scraped Page Results

The results are displayed as a list of URLs. The link icon on the left provides quick access to each valid link. Note that clicking it takes you to the selected URL; it does not initiate a scrape of that page. To perform additional scraping, copy and paste your desired URL into the form and repeat the process.

No Links Found

If you receive the message "No Links Found", no links were found in the response from the server. You may also receive this message when querying an HTTP service that redirects to HTTPS, or any address that returns a redirect, because the tool does not follow 301 or 302 redirects to the new location. Ensure you enter the URL of the actual page you wish to extract links from.

Command Line

Extracting links from a page can be done with a number of open source command line tools; lynx, a text-based browser, is perhaps the simplest.

lynx -listonly -dump url.example.com

API for the Extract Links Tool

Another option for accessing the extract links tool is the API. Rather than using the form above, you can make a direct request to the following resource with the ?q parameter set to the address you wish to extract links from.

https://api.hackertarget.com/pagelinks/?q=websitetotest.com
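A minimal Python sketch of building such a request (the helper function is ours; the endpoint and q parameter are as shown above):

```python
from urllib.parse import urlencode

API = "https://api.hackertarget.com/pagelinks/"


def build_request(target):
    """Return the full API URL for extracting links from `target`."""
    return API + "?" + urlencode({"q": target})


url = build_request("websitetotest.com")
print(url)  # → https://api.hackertarget.com/pagelinks/?q=websitetotest.com

# Fetching the result requires network access, e.g.:
# from urllib.request import urlopen
# body = urlopen(url).read().decode()
```

Using urlencode ensures the target address is safely percent-encoded if it contains characters such as slashes or spaces.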

The API is designed to be used in an ad-hoc fashion rather than for bulk queries and, like all our IP Tools, is limited to 50 (total) requests per day from a single IP address.

Have you seen our other Free IP and Network Testing tools?