
I want to use HTTP GET and POST commands to retrieve URLs from a website and parse the HTML. How do I do this?

  • Use hc.apache.org/httpclient-3.x Commented Dec 11, 2008 at 14:05
  • I have used JTidy in a project and it worked quite well. A list of other parsers is here, but apart from JTidy I don't know any of them. Commented Dec 11, 2008 at 17:55

2 Answers


You can use HttpURLConnection in combination with URL.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

URL url = new URL("http://example.com");
HttpURLConnection connection = (HttpURLConnection)url.openConnection();
connection.setRequestMethod("GET");
connection.connect();

InputStream stream = connection.getInputStream();
// read the contents using an InputStreamReader

2 Comments

Thank you. This shows the most basic way to do it, and it's simple once you understand what a plain URL connection requires. For the longer term, though, the better strategy would be to use [HTTP Client](http://hc.apache.org/httpcomponents-client/index.html "HTTP Client") for more advanced and feature-rich ways to complete this task.
Create a BufferedReader using the InputStream to read the content into a string variable.
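
Putting the answer and the comment above together, here is a minimal, compilable sketch that reads the whole response body into a String with a BufferedReader. The embedded com.sun.net.httpserver server is only a stand-in for a real website so the example runs offline; the class name and handler are assumptions, not part of the original answer.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class GetExample {
    // Fetch a URL with GET and return the response body as a String.
    static String fetch(String urlString) throws Exception {
        URL url = new URL(urlString);
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.connect();

        StringBuilder body = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line).append('\n');
            }
        }
        connection.disconnect();
        return body.toString();
    }

    public static void main(String[] args) throws Exception {
        // A tiny local server stands in for a real website here,
        // so the example runs without network access.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            byte[] response = "<html><body>hello</body></html>".getBytes();
            exchange.sendResponseHeaders(200, response.length);
            exchange.getResponseBody().write(response);
            exchange.close();
        });
        server.start();

        String html = fetch("http://localhost:" + server.getAddress().getPort() + "/");
        System.out.println(html.trim());

        server.stop(0);
    }
}
```

With a real site you would pass its URL to fetch() directly; the try-with-resources block ensures the reader and underlying stream are closed even if reading fails.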

The easiest way to do a GET is to use the built-in java.net.URL. However, as mentioned, httpclient is the proper way to go, as it will, among other things, let you handle redirects.

For parsing the HTML, you can use an HTML parser.
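
The question also asks about POST, which neither answer shows. A hedged sketch using the same built-in HttpURLConnection follows: the key differences from GET are setDoOutput(true) and writing the form data to the connection's output stream. The local echo server and the /submit path are stand-ins so the example runs offline, not part of either answer.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PostExample {
    public static void main(String[] args) throws Exception {
        // Local echo server in place of a real site, so this runs offline.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/submit", exchange -> {
            byte[] request = exchange.getRequestBody().readAllBytes();
            byte[] response = ("received: " + new String(request, StandardCharsets.UTF_8))
                    .getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, response.length);
            exchange.getResponseBody().write(response);
            exchange.close();
        });
        server.start();

        URL url = new URL("http://localhost:" + server.getAddress().getPort() + "/submit");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("POST");
        connection.setDoOutput(true); // required before writing a request body
        connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // Write URL-encoded form data as the request body.
        try (OutputStream out = connection.getOutputStream()) {
            out.write("name=value".getBytes(StandardCharsets.UTF_8));
        }

        // Read the server's reply.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
            System.out.println(reader.readLine());
        }
        connection.disconnect();
        server.stop(0);
    }
}
```

For real form submissions, each key and value should be passed through URLEncoder.encode before being joined with `=` and `&`; httpclient handles this (and redirects after the POST) for you.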

