I want to use HTTP GET and POST commands to retrieve URLs from a website and parse the HTML. How do I do this?
- Use hc.apache.org/httpclient-3.x – Nick Holt, Dec 11, 2008
- I have used JTidy in a project and it worked quite well. A list of other parsers is here, but apart from JTidy I don't know any of them. – Markus, Dec 11, 2008
2 Answers
You can use HttpURLConnection in combination with URL.
URL url = new URL("http://example.com");
HttpURLConnection connection = (HttpURLConnection)url.openConnection();
connection.setRequestMethod("GET");
connection.connect();
InputStream stream = connection.getInputStream();
// read the contents using an InputStreamReader
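To finish the snippet above, here is a minimal sketch of reading the response into a String with a BufferedReader (the UTF-8 charset is an assumption; a real client should read it from the Content-Type header):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class SimpleGet {
    // Reads an InputStream to a String line by line, as the comment above suggests.
    // Assumes UTF-8; normalizes line endings to '\n'.
    static String readAll(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        URL url = new URL("http://example.com"); // placeholder URL from the answer
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.connect();
        String body = readAll(connection.getInputStream());
        System.out.println(body);
    }
}
```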
2 Comments
Johnny Maelstrom
Thank you. This shows the most basic way to do it, and it's simple once you understand what a plain URL connection requires. However, the longer-term strategy would be to use [HTTP Client](hc.apache.org/httpcomponents-client/index.html) for more advanced and feature-rich ways to complete this task.
rockit
Create a BufferedReader from the InputStream to read the content into a string variable.
The easiest way to do a GET is to use the built-in java.net.URL. However, as mentioned, HttpClient is the proper way to go, since among other things it lets you handle redirects.
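The question also asks about POST. As a rough sketch (the endpoint URL and parameter names below are placeholders, not from the original answers), HttpURLConnection can send a form-encoded POST like this:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

public class SimplePost {
    // Percent-encodes parameters into an application/x-www-form-urlencoded body.
    // Note URLEncoder encodes spaces as '+', which is correct for form bodies.
    static String formEncode(Map<String, String> params) throws Exception {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (sb.length() > 0) sb.append('&');
            sb.append(URLEncoder.encode(e.getKey(), "UTF-8"))
              .append('=')
              .append(URLEncoder.encode(e.getValue(), "UTF-8"));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/form"); // hypothetical endpoint
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("POST");
        connection.setDoOutput(true); // required before writing a request body
        connection.setRequestProperty("Content-Type",
                "application/x-www-form-urlencoded");

        Map<String, String> params = new LinkedHashMap<>();
        params.put("name", "value with spaces"); // hypothetical parameter
        try (OutputStream out = connection.getOutputStream()) {
            out.write(formEncode(params).getBytes("UTF-8"));
        }
        System.out.println(connection.getResponseCode());
    }
}
```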
For parsing the HTML, you can use an HTML parser such as the ones mentioned in the comments above.
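If pulling in a parser library isn't an option, the JDK itself ships a basic HTML parser in javax.swing.text.html. A minimal sketch (not from the answers above) that extracts the href of each anchor tag:

```java
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.swing.text.MutableAttributeSet;
import javax.swing.text.html.HTML;
import javax.swing.text.html.HTMLEditorKit;
import javax.swing.text.html.parser.ParserDelegator;

public class LinkExtractor {
    // Collects href attribute values from <a> tags using the JDK's
    // built-in (lenient, Swing-oriented) HTML parser.
    static List<String> extractLinks(Reader html) throws Exception {
        final List<String> links = new ArrayList<>();
        HTMLEditorKit.ParserCallback callback = new HTMLEditorKit.ParserCallback() {
            @Override
            public void handleStartTag(HTML.Tag tag, MutableAttributeSet attrs, int pos) {
                if (tag == HTML.Tag.A) {
                    Object href = attrs.getAttribute(HTML.Attribute.HREF);
                    if (href != null) links.add(href.toString());
                }
            }
        };
        new ParserDelegator().parse(html, callback, true);
        return links;
    }

    public static void main(String[] args) throws Exception {
        String html = "<html><body><a href=\"http://example.com\">link</a></body></html>";
        System.out.println(extractLinks(new StringReader(html)));
    }
}
```

This is fine for simple scraping; a dedicated parser like JTidy, mentioned in the comments, handles malformed real-world HTML more robustly.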