Take site.ip138.com as an example: open the F12 developer tools, submit an IP query, and watch the requests in the console (the relevant request is shown in the figure below). For parsing the returned HTML, jsoup is a perfect fit. jsoup is a Java HTML parser that can parse a URL or a block of HTML text directly, and it provides a very convenient API for extracting and manipulating data through the DOM, CSS selectors, and jQuery-like methods.
// Parse the HTML string into a Document object
Document document = Jsoup.parse(result);
if (document == null) {
    logger.error("Jsoup parse got a null document!");
}
// Get the element whose ID is "list" (doesn't this feel a lot like jQuery?)
Element listEle = document.getElementById("list");
// Filter its children by attribute name and value, then collect their text with eachText()
return listEle.getElementsByAttributeValue("target", "_blank").eachText();
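Since jsoup can also fetch a URL directly and supports jQuery-style CSS selectors, the same extraction could be written roughly as the sketch below; fetchDomains is a hypothetical helper, and the selector assumes the page keeps the same #list element:

public List<String> fetchDomains(String url) throws IOException {
    // jsoup performs the HTTP GET itself here; the custom headers and cookies used with
    // HttpClient below are not applied in this simplified form
    Document doc = Jsoup.connect(url).get();
    // Select every element under #list that has target="_blank" and return their text
    return doc.select("#list [target=_blank]").eachText();
}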
The result string is obtained by simulating the browser's HTTP request with HttpClient, copying the headers observed in DevTools:

HttpGet httpGet = new HttpGet(url);
httpGet.setHeader("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8");
httpGet.setHeader("Accept-Encoding", "gzip, deflate");
httpGet.setHeader("Accept-Language", "zh-CN,zh;q=0.9");
httpGet.setHeader("Cache-Control", "max-age=0");
httpGet.setHeader("Connection", "keep-alive");
httpGet.setHeader("Cookie", "Hm_lvt_d39191a0b09bb1eb023933edaa468cd5=1553090128; BAIDU_SSP_lcr=https://www.baidu.com/link?url=FS0ccst469D77DpdXpcGyJhf7OSTLTyk6VcMEHxT_9_&wd=&eqid=fa0e26f70002e7dd000000065c924649; pgv_pvi=6200530944; pgv_si=s4712839168; Hm_lpvt_d39191a0b09bb1eb023933edaa468cd5=1553093270");
httpGet.setHeader("DNT", "1");
httpGet.setHeader("Host", host);
httpGet.setHeader("Upgrade-Insecure-Requests", "1");
httpGet.setHeader("User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.132 Safari/537.36");
String result = HttpUtils.doGet(httpGet);
The HTTP request utility class:

public class HttpUtils {

    private static Logger logger = LoggerFactory.getLogger(HttpUtils.class);

    public static String doGet(HttpGet httpGet) {
        CloseableHttpClient httpClient = null;
        try {
            httpClient = HttpClients.createDefault();
            RequestConfig requestConfig = RequestConfig.custom()
                    .setConnectTimeout(5000)
                    .setConnectionRequestTimeout(10000)
                    .setSocketTimeout(5000)
                    .build();
            httpGet.setConfig(requestConfig);
            HttpResponse httpResponse = httpClient.execute(httpGet);
            if (httpResponse.getStatusLine().getStatusCode() == 200
                    || httpResponse.getStatusLine().getStatusCode() == 302) {
                HttpEntity entity = httpResponse.getEntity();
                return EntityUtils.toString(entity, "utf-8");
            } else {
                logger.error("Request StatusCode={}", httpResponse.getStatusLine().getStatusCode());
            }
        } catch (Exception e) {
            logger.error("Request Exception={}:", e);
        } finally {
            if (httpClient != null) {
                try {
                    httpClient.close();
                } catch (IOException e) {
                    logger.error("Failed to close httpClient", e);
                }
            }
        }
        return null;
    }
}
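The request-building and parsing snippets above live inside the DomainSpiderService that the controller below injects. The full service is not reproduced in this post (it is in the linked repository); a minimal sketch, assuming the ip138 query is a plain GET of https://site.ip138.com/{ip}/ and that the result page has the #list element seen in DevTools, might look like this:

@Service
public class DomainSpiderService {

    private static Logger logger = LoggerFactory.getLogger(DomainSpiderService.class);

    public List<String> domainSpiderOfIp138(String ip) {
        String host = "site.ip138.com";
        // Assumed query URL format; check the actual request captured in DevTools
        String url = "https://" + host + "/" + ip + "/";
        HttpGet httpGet = new HttpGet(url);
        httpGet.setHeader("Host", host);
        httpGet.setHeader("User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.132 Safari/537.36");
        String result = HttpUtils.doGet(httpGet);
        if (result == null) {
            return Collections.emptyList();
        }
        // Parse the response and pull the domain links out of the #list element
        Document document = Jsoup.parse(result);
        Element listEle = document.getElementById("list");
        if (listEle == null) {
            logger.error("Element #list not found in response from {}", host);
            return Collections.emptyList();
        }
        return listEle.getElementsByAttributeValue("target", "_blank").eachText();
    }

    public List<String> domainSpiderOfAizan(String ip) {
        // The aizhan.com fallback follows the same fetch-then-parse pattern with that site's
        // own URL and element structure; omitted here, see the repository for the full version.
        return Collections.emptyList();
    }
}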
Add a new Controller:

@RestController
public class DomainSpiderController {

    private static Logger logger = LoggerFactory.getLogger(DomainSpiderController.class);

    @Autowired
    private DomainSpiderService domainSpiderService;

    /**
     * @param ip e.g. 119.75.217.109
     * @return the list of domains bound to this IP
     */
    @RequestMapping("/spider/{ip}")
    @ResponseBody
    public List<String> domainSpider(@PathVariable("ip") String ip) {
        long startTime = System.currentTimeMillis();
        // Query ip138 first; fall back to aizhan if nothing is found
        List<String> domains = domainSpiderService.domainSpiderOfIp138(ip);
        if (domains == null || domains.size() == 0) {
            domains = domainSpiderService.domainSpiderOfAizan(ip);
        }
        long endTime = System.currentTimeMillis();
        logger.info("Spider task finished, total time: {}s", (endTime - startTime) / 1000);
        return domains;
    }
}
Start the Spring Boot application and visit http://localhost:8080/spider/119.75.217.109 in a browser; the returned result includes: dns.aizhan.com
The complete code is available on Gitee (码云) and GitHub; you are welcome to download it and study it.