Google Hacking History by Bishop Fox
History of major events affecting the topic of Google Hacking.
2002-12-08 06:27:03
googleDorks collection begins by Johnny Long
Johnny Long begins to collect interesting Google searches and labels them googleDorks.
2004-05-27 06:27:03
Foundstone SiteDigger v1 released
Foundstone SiteDigger v1 released. SiteDigger searches Google’s cache to look for vulnerabilities, errors, configuration issues, proprietary information, and interesting security nuggets on web sites.
2004-10-05 10:22:53
Google Hacking Database (GHDB) officially begins
Google Hacking Database (GHDB) officially begins. http://www.hackersforcharity.org/ghdb/ From Johnny's blog on Oct 5, 2004: The googledorks database has been renamed. The new title, the Google Hacking Database (GHDB), more accurately reflects the fact that this is more than just a hobby now. Thanks to the members of the Search Engine Hacking Forums, the moderators that keep things running smoothly, and the overwhelming press around this topic, the database is literally the original and most comprehensive list of Google hacking queries on the planet. The GHDB has done so well that we're working harder than ever to integrate it into the leading tools out there. To that end, we've ported the GHDB to work with both Athena and SiteDigger. These ports will be available shortly. Thanks for your continued support, and welcome aboard. We're glad you're here.
2005-01-06 06:27:15
Foundstone SiteDigger v2 released
SiteDigger searches Google’s cache to look for vulnerabilities, errors, configuration issues, proprietary information, and interesting security nuggets on web sites.
2005-02-13 06:27:15
Google Hack Honeypot first release
Google Hack Honeypot is the reaction to a new type of malicious web traffic: search engine hackers. GHH is a “Google Hack” honeypot. It is designed to provide reconnaissance against attackers who use search engines as a hacking tool against your resources. GHH implements honeypot theory to provide additional security to your web presence.
2005-02-20 06:27:15
Google Hacking v1 released by Johnny Long
Johnny Long releases the first edition of his book Google Hacking for Penetration Testers.
2006-01-10 06:27:15
MSNPawn v1.0 released by NetSquare
MSNPawn - Footprinting, Profiling & Assessment with MSN Search. MSNPawn has been designed and developed on the .Net framework and must be installed on the system. The following utilities are bundled with MSNPawn: MSNHostFP - Supply an IP address or IP address range to fetch all possible virtual hosts or applications running on each IP address. MSNDomainFP - Supply a domain name to fetch the top 50 child domains, considering the supplied domain name as the parent. MSNCrossDomainFP - Supply an application domain to fetch the top 50 domains pointing to this particular domain on the Internet. MSNCrawler - Supply a domain or application name to fetch all possible links crawled by the search engine. MSNFetch - Supply a domain and a rules file; the tool will run each rule in the file against the domain specified and fetch the first five results of the resulting query, which can help in assessing an application. Search.MSN - Provides a place to run your search against MSN and gather all URLs. MSNPawn White Paper: http://net-square.com/msnpawn/MSNPawn_research_usage.pdf
2006-07-17 06:27:15
MWSearch released - Malware executable searching via Google
HD Moore releases MWSearch, a tool that searches for malware executables via Google Binary Search. Using a database of digital fingerprints of known malware--called "signatures"--the Malware Search tool uses the popular search engine to find a number of known worms and viruses. It was developed by HD Moore, the researcher best known as the developer of the widely used Metasploit hacking tool. http://www.pcworld.com/article/126450/new_tool_searches_google_for_malware.html http://djtechnocrat.blogspot.com/2006/07/mwsearch-finding-malware-with-google.html
2006-12-05 06:27:15
Google stops issuing Google SOAP API keys
When Google stopped issuing new Google SOAP API keys in 2006, it was the beginning of the end for all of the Google hacking tools available at the time. The Google SOAP API was the interface that hacking tools used to make Google queries. These tools hobbled along with partial functionality until September 2009, when Google closed down the Google SOAP API entirely. Because of this, Google hacking technique and tool development was largely stagnant from 2006 to 2009.
2007-03-29 06:27:15
Bing disables inurl:, link:, and linkdomain:
These operators were specifically disabled to prevent Google Hacking-style techniques against Bing.
2007-11-02 06:27:15
Google Hacking v2 released
Johnny Long releases Google Hacking for Penetration Testers, Volume 2.
2008-03-01 02:02:10
cDc Goolag - gui tool released
cDc (Cult of the Dead Cow) releases a GUI-driven tool for Google Hacking called Goolag.
2008-10-31 02:02:10
Google Alerts adds RSS feed capabilities
Google Alerts adds RSS feed capability, giving you updated access to search results. This paves the way for the later Diggity Hacking Alert feeds.
2009-04-01 02:02:10
Initial FOCA footprinting tool releases
FOCA Free 3.0.2 is a tool for carrying out fingerprinting and information-gathering processes in web audit work. The free version finds servers, domains, URLs, and published documents, and discovers the versions of software running on servers and clients. FOCA became famous for extracting metadata from public documents, but today it is much more than that.
2009-09-07 02:02:10
Google shuts down SOAP Search API
All existing Google Hacking tools cease to function at this point. When Google stopped issuing new Google SOAP API keys in 2006, it was the beginning of the end for all of the Google hacking tools available at the time. The Google SOAP API was the interface that hacking tools used to make Google queries. These tools hobbled along with partial functionality until September 2009, when Google closed down the Google SOAP API entirely. Because of this, Google hacking technique and tool development was largely stagnant from 2006 to 2009.
2009-11-01 02:02:10
Binging tool released by Blueinfy
Binging - Footprinting and Discovery Tool. Binging is a simple tool to query the Bing search engine. It uses your Bing API key and fetches multiple results. The tool can be used for cross-domain footprinting of Web 2.0 applications, site discovery, reverse lookup, host enumeration, etc. One can use various directives like site:, ip:, etc. and run queries against the engine. On top of that, the tool provides filtering capabilities, so you can ask for unique URLs or hosts. It is also possible to filter results by applying regular expressions. Get your Bing API key and use this tool for your audit, assessment, and research. http://www.darknet.org.uk/2009/11/binging-beta-footprinting-discovery-tool-google-hacking/
2009-12-01 02:02:10
Foundstone SiteDigger v3.0 released
Foundstone SiteDigger v3.0 released, using the new Google AJAX API. SiteDigger 3.0 searches Google’s cache to look for vulnerabilities, errors, configuration issues, proprietary information, and interesting security nuggets on web sites.
2010-01-01 02:02:10
Goolag.org disappears
The site's IP address went from 75.126.102.193 to 10.4.223.196. The Goolag tool is still available on PacketStorm. http://www.goolag.org http://toolbar.netcraft.com/site_report?url=http://www.goolag.org/
2010-02-18 02:02:10
Shodanhq.com domain registered
SHODAN - Hacker Search Engine. Indexes and makes searchable service banners for the whole Internet, covering HTTP (port 80) as well as some FTP (port 21), SSH (port 22), and Telnet (port 23) services.
2010-04-21 02:02:10
Stach & Liu - Google Hacking Diggity Project initial releases
The Google Hacking Diggity Project is a research and development initiative dedicated to investigating the latest techniques that leverage search engines, such as Google and Bing, to quickly identify vulnerable systems and sensitive data in corporate networks. This project page contains downloads and links to our latest Google Hacking research and free security tools. Defensive strategies are also introduced, including innovative solutions that use Google Alerts to monitor your network and systems.
2010-07-29 02:02:10
Stach & Liu Unveils Google/Bing Diggity Hacking Alert RSS Feeds
Stach & Liu Unveils Google/Bing Diggity Hacking Alert RSS Feeds at Black Hat USA 2010. Defensive strategies for protecting your organization from Google Hacking attacks traditionally have been limited, mostly falling back on the approach of “Google Hack yourself”. This approach has several shortcomings. While a few free tools exist that allow security staff to Google Hack their organization, they typically are inconvenient, only utilize one search engine, and provide only a snapshot in time of your organization’s exposure. Stach & Liu has created the first ever truly defensive tools to help protect your organization from having its vulnerabilities exposed via Google, Bing, and other popular search engines. These tools comprise two major types: Alert RSS Feeds and Alert RSS Monitoring Tools. Together, they form a type of intrusion detection system (IDS) for Google hacking.
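As a rough illustration of what an Alert RSS Monitoring Tool does, the sketch below polls an alerts RSS feed and flags entries it has not seen before. This is only a minimal sketch, not the Diggity tools' actual implementation; the feed URL is a placeholder and the feedparser package is assumed.

```python
# Minimal sketch of an RSS-based "Google hacking alert" monitor.
# FEED_URL is a placeholder; feedparser is an assumed third-party dependency.
import time
import feedparser

FEED_URL = "https://www.google.com/alerts/feeds/EXAMPLE/EXAMPLE"  # placeholder
seen = set()

while True:
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        key = entry.get("id", entry.link)
        if key not in seen:
            seen.add(key)
            # A new search hit against the monitored query -- worth investigating.
            print(f"[ALERT] {entry.title} -> {entry.link}")
    time.sleep(300)  # poll every five minutes
```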
2010-11-01 02:02:10
Google AJAX API slated for retirement
Beginning of the end for McAfee SiteDigger v3.0 and other tools using the Google AJAX API.
2010-11-09 02:02:10
GHDB Reborn Announced – Exploit-db.com
Exploit-db.com picks up maintenance of the GHDB.
2011-06-30 10:05:07
Yale oversight exposes 43,000 Social Security numbers via Google Hacking
On June 30, 2011, a Yale alumnus Googling his own name discovered an Excel spreadsheet containing his Social Security number and those of 43,000 other alumni, exposed on a public Yale FTP server that had been indexed by Google. https://www.cnet.com/news/yale-oversight-exposes-43000-social-security-numbers/ http://www.huffingtonpost.com/2011/08/24/yale-social-security-numbers-google-hacking_n_935400.html http://www.doj.nh.gov/consumer/security-breaches/documents/yale-university-20110809.pdf
2011-11-14 04:37:56
CommonCrawl.org - Now Everyone Can Be Google - 14Nov2011
Common Crawl is a nonprofit 501(c)(3) organization that crawls the web and freely provides its archives and datasets to the public. Common Crawl's web archive consists of petabytes of data collected since 2011, and it generally completes crawls every month. The corpus contains raw web page data, extracted metadata, and text extractions. The Common Crawl dataset lives on Amazon S3 as part of the Amazon Public Datasets program. CommonCrawl: http://commoncrawl.org/ https://en.wikipedia.org/wiki/Common_Crawl https://github.com/commoncrawl/ https://www.i-programmer.info/news/136-open-source/3320-common-crawl.html http://search.slashdot.org/story/11/12/19/0154227/mapreduce-for-the-masses-with-common-crawl-data
2012-01-15 02:02:10
Google Code Search shuts down
Google Code Search shuts down.
2012-03-05 02:02:10
InformationWeek Reports: Using Google to Find Vulnerabilities In Your IT Environment
InformationWeek Reports releases Google Hacking paper written by Fran Brown.
2012-04-12 02:02:10
Bing API to start charging $40/month for 20,000 queries
For the past several years, the Bing Search API has made search data available for developers to innovate and build upon. Today we are announcing that the Bing Search API will transition to an offering made available on the Windows Azure Marketplace. The Windows Azure Marketplace is a one stop shop for cloud data, apps, and services, including the Microsoft Translator API. Through this platform, developers can access hundreds of data sets and APIs and distribute their applications through the marketplace. With the transition, Bing Search API developers will have access to fresher results, improved relevancy, and more opportunities to monetize their usage of the Search API. To offer these services at scale, we plan to move to a monthly subscription model. Developers can expect subscription pricing to start at approximately $40 (USD) per month for up to 20,000 queries each month. http://www.bing.com/community/site_blogs/b/developer/archive/2012/04/12/bing-dev-update.aspx https://datamarket.azure.com/dataset/5BA839F1-12CE-4CCE-BF57-A49D98D29A44
2012-09-12 02:02:10
ISSA Journal – SearchDiggity: Dig Before They Do
Stach & Liu’s Google Hacking Diggity Project was featured in the toolsmith article SearchDiggity: Dig Before They Do, found in the September 2012 edition of The ISSA Journal.
2013-05-08 02:02:10
NSA releases own guide to Google Hacking and other Internet research
There’s so much data available on the internet that even government cyberspies need a little help now and then to sift through it all. So to assist them, the National Security Agency produced a book to help its spies uncover intelligence hiding on the web. The 643-page tome, called Untangling the Web: A Guide to Internet Research (.pdf), was just released by the NSA following a FOIA request filed in April by MuckRock, a site that charges fees to process public records requests for activists and others. The book was published by the Center for Digital Content of the National Security Agency, and is filled with advice for using search engines, the Internet Archive and other online tools. But the most interesting is the chapter titled “Google Hacking.” http://www.wired.com/2013/05/nsa-manual-on-hacking-internet/ https://www.nsa.gov/public_info/_files/Untangling_the_Web.pdf http://search.slashdot.org/story/13/05/09/1434237/the-nsas-own-guide-to-google-hacking-and-other-internet-research
2013-06-13 02:02:10
SearchDiggity - Bishop Fox Edition
Release of the new SearchDiggity v3.1, the first version to be rebranded under Bishop Fox. It also includes a brand new, significantly updated CHM help file, and BingDiggity has been migrated to the new Bing Search API.
2013-07-01 02:02:10
Google Reader Retires
Google Reader retires. As a result, the Google Diggity Alerts FUNDle Bundle, which relied on Google Reader, is broken. Google Alerts has also temporarily suspended its RSS capabilities, taking down the Google Hacking Alerts for now.
2013-12-06 09:21:23
NikolaiT - GoogleScraper - Python module to scrape several search engines – initial release
A Python module to scrape several search engines (such as Google, Yandex, Bing, DuckDuckGo, Baidu, and others) using proxies (SOCKS4/5, HTTP proxy) and many different IPs, with asynchronous networking support (very fast). https://github.com/NikolaiT/GoogleScraper http://scrapeulous.com/googlescraper-260-keywords-in-a-second.html https://vimeo.com/user24568030 https://pypi.python.org/pypi/GoogleScraper/ https://web.archive.org/web/20160717065524/http://incolumitas.com/ https://web.archive.org/web/20160514220352/http://incolumitas.com/pages/googlescraper-py/#googlescraper-py
2014-07-30 09:21:23
SHODAN switches to new site: Shodan.io
SHODAN switches over to new URL, from old http://www.shodanhq.com/ to new https://www.shodan.io/.
2014-08-25 02:02:10
Netflix - Announcing Scumblr and Sketchy - Search, Screenshot, and Reclaim the Internet
Many security teams need to stay on the lookout for Internet-based discussions, posts, and other bits that may be of impact to the organizations they are protecting. These teams then take a variety of actions based on the nature of the findings discovered. Netflix’s security team has these same requirements, and today we’re releasing some of the tools that help us in these efforts. Scumblr is a Ruby on Rails web application that allows searching the Internet for sites and content of interest. Scumblr includes a set of built-in libraries that allow creating searches for common sites like Google, Facebook, and Twitter. For other sites, it is easy to create plugins to perform targeted searches and return results. Once you have Scumblr set up, you can run the searches manually or automatically on a recurring basis. http://techblog.netflix.com/2014/08/announcing-scumblr-and-sketchy-search.html http://threatpost.com/netflix-open-source-security-tools-solve-range-of-challenges/107931 https://github.com/netflix/scumblr
2014-08-25 11:05:23
Feds Issue Bulletin on Google Dorking
A bulletin issued by the Department of Homeland Security, the FBI and the National Counterterrorism Center earlier this month warns law enforcement and private security personnel that malicious cyber actors can use “advanced search techniques” to discover sensitive information and other vulnerabilities in websites. The bulletin, titled “Malicious Cyber Actors Use Advanced Search Techniques,” describes a set of techniques collectively referred to as “Google dorking” or “Google hacking” that use “advanced operators” to refine search queries to provide more specific results. http://publicintelligence.net/feds-google-dorking/ https://publicintelligence.net/dhs-fbi-nctc-google-dorking/ https://info.publicintelligence.net/DHS-FBI-NCTC-GoogleDorking.pdf http://www.computerworld.com/article/2597539/cybercrime-hacking/feds-issue-bulletin-warning-about-malicious-google-dorking-cyber-actors.html http://arstechnica.com/security/2014/08/feds-warn-first-responders-of-dangerous-hacking-tool-google-search/
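For context, the "advanced operators" the bulletin refers to are ordinary Google search qualifiers combined into targeted queries. A few illustrative examples of that kind of dork, using example.com as a placeholder domain (these specific queries are hypothetical, not taken from the bulletin):

```
site:example.com filetype:xls intext:"ssn"
intitle:"index of" "parent directory" site:example.com
inurl:wp-config.php.bak
```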
2016-01-15 09:21:23
Google Developers Blog - Retirement of certain Google search APIs
Back in 2011, we announced the deprecation of the following APIs: Google Patent Search API, Google News Search API, Google Blog Search API, Google Video Search API, Google Image Search API. We supported these APIs for a three year period (and beyond), but as all things come to an end, so has the deprecation window for these APIs. We are now announcing the turndown of the above APIs. These APIs will cease operations on February 15, 2016. You may wish to look at our Custom Search API as an alternative for these APIs.
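Google's suggested alternative, the Custom Search JSON API, can still run dork-style queries programmatically. A minimal sketch follows, assuming the requests package; API_KEY and CSE_ID are placeholders obtained from the Google developer console, and the query is only an example.

```python
# Minimal sketch: run a dork-style query through the Google Custom Search JSON API.
# API_KEY and CSE_ID are placeholders; requests is an assumed third-party dependency.
import requests

API_KEY = "YOUR_API_KEY"
CSE_ID = "YOUR_CSE_ID"

params = {
    "key": API_KEY,
    "cx": CSE_ID,
    "q": 'site:example.com filetype:xls "password"',  # example dork query
}
resp = requests.get("https://www.googleapis.com/customsearch/v1", params=params)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item["title"], item["link"])
```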
2016-03-27 09:21:23
Google Search Technique Aided N.Y. Dam Hacker in Iran
Earlier this week, the US blamed Iranian hackers for a series of attacks in 2012 and 2013 on several targets, including a New York City dam. How did the accused hacker get access to this dam? He Googled it, according to the Wall Street Journal. Partner Francis Brown’s Diggity Project is featured in Business Insider – Something Called ‘Google Dorking’ Helps Hackers Find Out Stuff No One Wants Them to Know. https://www.wsj.com/articles/google-search-technique-aided-n-y-dam-hacker-in-iran-1459122543 https://www.businessinsider.com/google-dorking-helps-hackers-2016-4 https://www.cnet.com/news/a-hackers-next-target-is-just-a-web-search-away-google-dorking/
2016-11-04 20:51:33
Azure Data Market “Bing Search” and “Web Results Only” APIs to be decommissioned December 15, 2016
With a little over one month to go before December 15, End of Life for the Azure Data Market “Bing Search” and “Bing Web Results Only” API offerings is quickly approaching. At Bing we are continuously hard at work to improve our offerings and are excited that the next version of our APIs is now part of Microsoft Cognitive Services. These new APIs bring with them exciting new capabilities and will replace the Bing APIs currently available on the Azure Data Market. API releases contain important updates leading to great new experiences, more innovation and new features to delight our customers. Regular updates enable our teams to schedule changes in a predictable cadence and allow our customers to plan their work and take advantage of the new innovations, features, and industry standards in a timely manner. Partners who are currently using the APIs via the Azure Data Market will have the option to migrate to the Microsoft Cognitive Services Search API offerings prior to December 15, 2016. If you choose not to upgrade to the new API offering, you will automatically lose access to your existing Bing Search or Bing Web Results Only Azure subscriptions at 5:00 PM Pacific Time on December 15, 2016, and calls to these API endpoints will no longer return any data. https://blogs.msdn.microsoft.com/bingdevcenter/2016/11/04/azure-data-market-bing-search-and-web-results-only-apis-to-be-decommissioned-december-15/ https://msdn.microsoft.com/en-US/library/mt707570.aspx
2017-06-15 04:37:56
dorkbot – Command-Line Tool For Google Dorking - 15June2017
dorkbot – Command-Line Tool For Google Dorking - released 15June2017 https://github.com/utiso/dorkbot https://www.darknet.org.uk/2018/02/dorkbot-command-line-tool-for-google-dorking/
2018-01-26 04:37:56
Open Source Intelligence Techniques - 6th Edition Released - 26Jan2018
Open Source Intelligence Techniques: Resources for Searching and Analyzing Online Information - 6th Edition Released - 26Jan2018
2018-05-11 12:37:16
SecurityAffairs.co - Google Dorks to mine passwords from dozens of public Trello boards - 11May2018
A security enthusiast found a vulnerability in Trello's web management, and with a simple dork it is now possible to mine passwords from dozens of public Trello boards.
2018-11-05 04:37:56
Google dorks were the root cause of CIA compromise, resulting in over 30 dead spies - Nov 2018
Google dorks were the root cause of a catastrophic compromise of CIA communications, resulting in over 30 dead spies (reported November 2018). https://securityaffairs.co/wordpress/77701/intelligence/cia-communications-dismantled.html https://www.yahoo.com/news/cias-communications-suffered-catastrophic-compromise-started-iran-090018710.html https://www.theregister.co.uk/2018/11/02/iran_cracked_cia_google/ https://foreignpolicy.com/2018/08/15/botched-cia-communications-system-helped-blow-cover-chinese-agents-intelligence/ https://www.nytimes.com/2017/05/20/world/asia/china-cia-spies-espionage.html
2019-07-17 00:50:08
Modern Google Dorks - httphost.github.io - initial release - 17July2019
Modern Google Dorks - This continuously updated list is a modern collection of Google dorks based on the allinurl: filter. For example: allinurl:content.php?id=
2019-07-18 06:48:16
GitGot by Bishop Fox - Search GitHub for sensitive secrets - Initial Tool Release - 18July2019
GitGot is a semi-automated, feedback-driven tool to empower users to rapidly search through troves of public data on GitHub for sensitive secrets. During search sessions, users will provide feedback to GitGot about search results to ignore, and GitGot prunes the set of results. Users can blacklist files by filename, repository name, username, or a fuzzy match of the file contents. Read more about the semi-automated, human-in-the-loop design here: https://know.bishopfox.com/blog/going-semi-automated-in-an-automated-world-using-human-in-the-loop-workflows-to-improve-our-security-tools https://github.com/BishopFox/GitGot
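GitGot's own workflow is interactive and feedback-driven; as a rough illustration of the underlying idea only (querying GitHub's public code search for a sensitive keyword), the sketch below calls the GitHub REST code-search endpoint. This is not GitGot's implementation; GH_TOKEN and the query string are placeholders, and the endpoint requires an authenticated token.

```python
# Rough sketch: search GitHub code for a sensitive keyword via the REST API.
# Not GitGot's implementation. GH_TOKEN and the query are placeholders;
# the /search/code endpoint requires authentication.
import requests

GH_TOKEN = "YOUR_GITHUB_TOKEN"          # placeholder personal access token
query = '"example.com" password'        # placeholder organization keyword

resp = requests.get(
    "https://api.github.com/search/code",
    headers={
        "Authorization": f"token {GH_TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    params={"q": query, "per_page": 10},
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item["repository"]["full_name"], item["path"], item["html_url"])
```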
2019-07-18 17:46:23
SecurityAffairs.co - Scraping the TOR for rare contents - 18July2019
Scraping the “TOR hidden world” is a quite complex topic. First of all, you need exceptional computational power (mostly RAM) to let multiple runners grab web pages, extract new links, and re-run the scraping code against the newly extracted links. You also need a queue manager to handle conflicts between scrapers, and a database to keep the scraped data consistent. Second, you need great starting points; in other words, you need the .onion addresses where your scrapers start from.
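A generic sketch of the crawl loop described above: a queue of URLs, a fetch step, link extraction, and newly found links fed back into the queue. This is an illustration only, not the article's code; the seed URL is a placeholder, and reaching .onion addresses would additionally require routing requests through a Tor SOCKS proxy (e.g. proxies={"http": "socks5h://127.0.0.1:9050", "https": "socks5h://127.0.0.1:9050"} with the requests[socks] extra installed).

```python
# Generic crawl-queue sketch: fetch pages, extract links, enqueue new ones.
import re
from collections import deque

import requests

LINK_RE = re.compile(r'href="(https?://[^"]+)"')

def crawl(seed_urls, max_pages=50):
    queue = deque(seed_urls)          # the "great starting points"
    seen = set(seed_urls)
    pages = {}                        # stores scraped data (stand-in for a database)
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        pages[url] = resp.text
        for link in LINK_RE.findall(resp.text):
            if link not in seen:      # re-run scraping against newly extracted links
                seen.add(link)
                queue.append(link)
    return pages

if __name__ == "__main__":
    results = crawl(["https://example.com/"])  # placeholder seed
    print(f"crawled {len(results)} pages")
```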