Another Word For It Patrick Durusau on Topic Maps and Semantic Diversity

September 4, 2018

Tor Sites – Is Your Public IP Showing? [Terrorist-in-a-Box]

Filed under: Cybersecurity,Dark Web,Tor — Patrick Durusau @ 9:32 am

Public IP Addresses of Tor Sites Exposed via SSL Certificates by Lawrence Abrams.

From the post:

A security researcher has found a method that can be used to easily identify the public IP addresses of misconfigured dark web servers. While some feel that this researcher is attacking Tor or other similar networks, in reality he is exposing the pitfalls of not knowing how to properly configure a hidden service.

One of the main purposes of setting up a dark web site on Tor is to make it difficult to identify the owner of the site. In order to properly anonymize a dark web site, though, the administrator must configure the web server properly so that it only listens on localhost (127.0.0.1) and not on an IP address that is publicly exposed to the Internet.
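The researcher's method boils down to a reachability test: if the machine hosting your hidden service answers on its public IP at all, the cat is out of the bag. A minimal sketch (run it from a host outside your network; the IP and port are placeholders):

```python
import socket

def publicly_reachable(ip, port=80, timeout=5):
    """Return True if ip:port accepts a TCP connection from outside.

    A correctly configured hidden service listens only on 127.0.0.1,
    so its public IP should refuse (or time out) this connection.
    """
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False
```

A `True` result for your server's public IP on the web server's port means the service is exposed.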

The failure of people who intentionally walk on the wild side to properly secure their sites holds out great promise that government and industry sites are even more poorly secured.

If you are running a Tor site or someday hope to run a Tor site, read this post and make sure your public IP isn’t showing.

Unless your Tor site is a honeypot for government spy agencies. They lap up false information like there is no tomorrow.

Not something I have time for now, but consider mining intelligence reports as the basis for creating a Tor site, complete with information, chats, discussion forums, etc., distributed as a (non-public) download named “Terrorist-in-a-Box.” Unpack, install, configure (correctly) and yet another terrorist site is on the Dark Web. Have an AI run all the participants on the site. A challenging project to make it credible.

The intelligence community (IC) makes much of their ability to filter noise from content, so you can help them test that ability. It’s almost a patriotic duty.

August 2, 2018

Archives for the Dark Web: A Field Guide for Study

Filed under: Archives,Dark Web,Ethics,Journalism,Tor — Patrick Durusau @ 4:48 pm

Archives for the Dark Web: A Field Guide for Study by Robert A. Gehl.

Abstract:

This chapter provides a field guide for other digital humanists who want to study the Dark Web. In order to focus the chapter, I emphasize my belief that, in order to study the cultures of Dark Web sites and users, the digital humanist must engage with these systems’ technical infrastructures. I will provide specific reasons why I believe that understanding the technical details of Freenet, Tor, and I2P will benefit any researchers who study these systems, even if they focus on end users, aesthetics, or Dark Web cultures. To this end, I offer a catalog of archives and resources researchers could draw on and a discussion of why researchers should build their own archives. I conclude with some remarks about ethics of Dark Web research.

A highly recommended read, but it falls short on practical archiving advice for starting researchers and journalists.

Digital resources, Dark Web or no, can be ephemeral. Archiving produces the only reliable and persistent record of resources as you encountered them.

I am untroubled by Gehl’s concern for research ethics. Research ethics can disarm and distract scholars in the face of amoral enemies. Governments and their contractors, to name only two such enemies, exhibit no ethical code other than self-advantage.

Those who harm innocents rely on my non-contractual ethics at their own peril.

January 31, 2018

Don’t Mix Public and Dark Web Use of A Bitcoin Address

Filed under: Cybersecurity,Dark Web,Privacy,Security — Patrick Durusau @ 10:30 am

Bitcoin payments used to unmask dark web users by John E Dunn.

From the post:

Researchers have discovered a way of identifying those who bought or sold goods on the dark web, by forensically connecting them to Bitcoin transactions.

It sounds counter-intuitive. The dark web comprises thousands of hidden services accessed through an anonymity-protecting system, usually Tor.

Bitcoin transactions, meanwhile, are supposed to be pseudonymous, which is to say visible to everyone but not in a way that can easily be connected to someone’s identity.

If you believe that putting these two technologies together should result in perfect anonymity, you might want to read When A Small Leak Sinks A Great Ship to hear some bad news:

Researchers matched Bitcoin addresses found on the dark web with those found on the public web. Depending on the amount of information available on the public web, that matching identified named individuals.

Black Letter Rule: Maintain separate Bitcoin accounts for each online persona.

Black Letter Rule: Never use a public persona on the dark web or a dark web persona on the public web.

Black Letter Rule: Never make Bitcoin transactions between public versus dark web personas.
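The matching technique the researchers describe reduces to a set intersection: scrape Bitcoin addresses from both webs, then intersect. A rough sketch (the regex catches only legacy base58 addresses, not bech32 `bc1…` ones):

```python
import re

# Legacy (base58) Bitcoin addresses only; excludes the ambiguous chars 0, O, I, l.
BTC_ADDR = re.compile(r"\b[13][a-km-zA-HJ-NP-Z1-9]{25,34}\b")

def shared_addresses(public_pages, onion_pages):
    """Intersect Bitcoin addresses scraped from public-web and dark-web pages."""
    pub, dark = set(), set()
    for text in public_pages:
        pub.update(BTC_ADDR.findall(text))
    for text in onion_pages:
        dark.update(BTC_ADDR.findall(text))
    return pub & dark
```

Any address in the intersection ties a dark web persona to whatever identity surrounds that address on the public web.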

Remind yourself of basic OpSec rules every day.

January 8, 2018

16K+ Hidden Web Services (CSV file)

Filed under: Dark Web — Patrick Durusau @ 5:00 pm

I subscribe to the Justin at Hunchly Dark Web report. The current issue (daily) and archive are on Dropbox.

The daily issues are archived in .xlsx format. (Bleech!)

Yesterday I grabbed the archive, converted the files to CSV format, catted them together, cleaned up the extra headers and that resulted in a file with 16,814 links. HiddenServices-2017-07-13-2018-01-05.zip.
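For anyone repeating the exercise, the cat-and-clean-headers step is a few lines of Python (the file pattern and paths here are hypothetical):

```python
import csv
import glob

def merge_csvs(pattern, out_path):
    """Concatenate daily CSV exports, keeping only the first header row."""
    header_written = False
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in sorted(glob.glob(pattern)):
            with open(path, newline="") as f:
                rows = csv.reader(f)
                header = next(rows, None)  # peel off this file's header
                if header and not header_written:
                    writer.writerow(header)
                    header_written = True
                writer.writerows(rows)
```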

A number of uses come to mind: a seed list for a search engine, browsing by title, sub-setting for more specialized dark web lists, testing presence/absence of sites on sub-lists, etc.

I’m not affiliated with Hunch.ly but you should give their Inspector Hunchly a look. From the webpage:

Inspector Hunchly toils in the background of your web browser to track, analyze and store web pages while you perform online investigations.

Forgets nothing, keeps everything.
… (emphasis in original)

When using Inspector Hunchly, be mindful: anything you record can and will be discovered.

PS: The archive I downloaded, separate files for every day, 272.3 MB. My one file, 363.8 KB. Value added?

December 10, 2017

Incomplete Reporting – How to Verify A Dark Web Discovery?

Filed under: Cybersecurity,Dark Web,Security — Patrick Durusau @ 4:50 pm

1.4 Billion Clear Text Credentials Discovered in a Single Database by Julio Casal.

From the post:

Now even unsophisticated and newbie hackers can access the largest trove ever of sensitive credentials in an underground community forum. Is the cyber crime epidemic about to become exponentially worse?

While scanning the deep and dark web for stolen, leaked or lost data, 4iQ discovered a single file with a database of 1.4 billion clear text credentials — the largest aggregate database found in the dark web to date.

None of the passwords are encrypted, and what’s scary is that we’ve tested a subset of these passwords and most of them have been verified to be true.

The breach is almost two times larger than the previous largest credential exposure, the Exploit.in combo list that exposed 797 million records. This dump aggregates 252 previous breaches, including known credential lists such as Anti Public and Exploit.in, decrypted passwords of known breaches like LinkedIn as well as smaller breaches like Bitcoin and Pastebin sites.

This is not just a list. It is an aggregated, interactive database that allows for fast (one second response) searches and new breach imports. Given the fact that people reuse passwords across their email, social media, e-commerce, banking and work accounts, hackers can automate account hijacking or account takeover.

This database makes finding passwords faster and easier than ever before. As an example searching for “admin,” “administrator” and “root” returned 226,631 passwords of admin users in a few seconds.

The data is organized alphabetically, offering examples of trends in how people set passwords, reuse them and create repetitive patterns over time. The breach offers concrete insights into password trends, cementing the need for recommendations, such as the NIST Cybersecurity Framework.
… (emphasis in original)
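Casal’s point about one-second searches follows from the dump being alphabetically organized: sorted data needs no database, just binary search. A minimal sketch over an in-memory list of `user:pass` lines:

```python
import bisect

def lookup(sorted_lines, username):
    """Binary-search an alphabetically sorted list of 'user:pass' lines."""
    prefix = username + ":"
    i = bisect.bisect_left(sorted_lines, prefix)  # first possible match
    hits = []
    while i < len(sorted_lines) and sorted_lines[i].startswith(prefix):
        hits.append(sorted_lines[i])
        i += 1
    return hits
```

On a 1.4 billion line dump you would memory-map or index the file rather than load it, but the principle is the same: O(log n) to the first hit.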

The full post goes on to discuss sources of the data, details of the dump file, freshness and password reuse. See Casal’s post for those details.

But no links were provided to the:

“…largest trove ever of sensitive credentials in an underground community forum.”

How would you go about verifying such a discovery?

The post offers the following hints:

  1. “…single file … 1.4 billion clear text credentials”
  2. dump contains file “imported.log”
  3. list shown from “imported.log” has 55 unique file names

With #1, clear text credentials, I should be able to search for #2 “imported.log” and one of fifty-five (55) unique file names to come up with a fairly narrow set of search results. Not perfect but not a lot of manual browsing.
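That search strategy reduces to pairing the anchor term with each of the 55 dump filenames; a trivial helper to generate the candidate queries (filenames below are placeholders):

```python
def queries(anchor, filenames):
    """Pair the anchor term with each dump filename, quoted for exact match."""
    return [f'"{anchor}" "{name}"' for name in filenames]
```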

All onion search engines have .onion addresses.

Ahmia: never got to try one of the file names; “imported.log” returns 0 results.

Caronte: I entered “imported.log,” but Caronte searches for “imported log.” Sigh, I really tire of corrective search interfaces. You? No useful results.

Haystack: 0 results for “imported.log.”

Not Evil: 3,973 “hits” for “imported.log.” With search refinement, still no joy.

Bottom line: No verification of the reported credentials discovery.

Possible explanations:

  • Files have been moved or renamed
  • Forum is password protected
  • Used the wrong Dark Web search engines

Verification is all the rage in mainstream media.

How do you verify reports of content on the Dark Web? Or do you?

November 28, 2017

Onion Deep Web Link Directory

Filed under: Dark Web,Searching — Patrick Durusau @ 2:11 pm

Onion Deep Web Link Directory (http://4bcdydpud5jbn33s.onion/)

Without a .onion address in hand, you will need to consult an .onion link list.

This .onion link list offers:

  • Hidden Service Lists and search engines – 23 links
  • Marketplace financial and drugs – 25 links
  • Hosting – 6 links
  • Blogs – 18 links
  • Forums and Chans – 12 links
  • Email and Messaging – 8 links
  • Political – 11 links
  • Hacking – 4 links
  • Warez – 12 links
  • Erotic 18+ – 7 links
  • Non-English – 18 links

Not an overwhelming number of links but enough to keep you and a Tor browser busy over the coming holiday season.
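If you would rather process a link list programmatically than browse it, extracting .onion hostnames from scraped text is a one-regex job (v2 addresses are 16 base32 characters; the 56-character v3 addresses rolling out as of late 2017 are matched too):

```python
import re

# v2 onion labels: 16 base32 chars; v3: 56 (16 + 40 more).
ONION = re.compile(r"\b[a-z2-7]{16}(?:[a-z2-7]{40})?\.onion\b")

def extract_onions(text):
    """Pull unique .onion hostnames out of scraped text or a link list."""
    return sorted(set(ONION.findall(text)))
```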

FYI, adult/erotic content sites are a primary means for the distribution of malware.

Hostile entity rules of engagement apply at all .onion addresses. (Just as no one “knows” you are a dog on the Internet, an entity found at a .onion address, could be the FBI. Act accordingly.)

I first saw this in the Hunchly Daily Hidden Services Report for 2017-11-03.

November 27, 2017

Berlusconi Market (Dark Web)

Filed under: Dark Web — Patrick Durusau @ 9:50 am

Berlusconi Market (http://55j6kjwki4vjtmzp.onion/)

This notice on Berlusconi Market brings a smile:

Due to high traffic, we are overwhelmed by tickets, vendor applies ecc. If any order-related error occurs, please contact us. Your money is safe. BM Staff

Hmmm, I’m using an allegedly non-traceable connection to a non-traceable website to connect with a non-traceable vendor, yet, I’m assured my money is safe.

That’s a big ask. 😉

The numbers will change but as of today:

  • Fraud – 676
  • Drugs & Chemicals – 2807
  • Guides & Tutorials – 325
  • Counterfeit Items – 145
  • Digital Products – 347
  • Jewels & Gold – 2
  • Weapons – 75
  • Carded Items – 20
  • Services – 72
  • Software & Malware – 20
  • Security & Hosting – 10
  • Other Listings – 33

Anonymous sources are as trustworthy as any government. Use security precautions suitable for a known hostile entity.

PS: As I cover useful Dark Web sites, I will be giving their .onion addresses. Not listing Dark Web addresses is a juvenile antic at best.

November 25, 2017

Daniel’s Hosting Service – Dark Web

Filed under: Dark Web — Patrick Durusau @ 10:42 am

Daniel’s Hosting Service (Onion Address)

Hunchly Daily Hidden Services Report delivers a daily email with hidden services discovered in the last 24 hours.

One of the common entries I have seen in those daily reports reads:

Site hosted by Daniel’s hosting service

which isn’t all that informative. 😉

I decided to check out Daniel’s Hosting Service (Onion Address) and found:

Here you can get yourself a hosting account on my server.

What you will get:

  • Free anonymous webhosting
  • Choose between PHP 7.0, 7.1 or no PHP support
  • Nginx Webserver
  • SQLite support
  • 1 MariaDB (MySQL) database
  • PHPMyAdmin and Adminer for web based database administration
  • Web-based file management
  • FTP access
  • SFTP access
  • No disk quota
  • mail() can send e-mails from your.onion@dhosting4okcs22v.onion (your.onion@hosting.danwin1210.me for clearnet)
  • Webmail and IMAP, POP3 and SMTP access to your mail account
  • Mail sent to anything@your.onion gets automatically redirected to your inbox
  • Your own .onion address
  • On request your own clearnet domain or a free subdomain of danwin1210.me. I can setup an I2P domain as well.
  • There is a missing feature or you need a special configuration? Just contact me and I’ll see what I can do.
  • Empty/Unused accounts will be automatically deleted after a month
  • More to come…

Rules

  • No child pornography!
  • No terroristic propaganda!
  • No illegal content according to German law!
  • No malware! (e.g. botnets)
  • No phishing!
  • No scams!
  • No spam!
  • No shops, markets or any other sites dedicated to making money! (This is a FREE hosting!)
  • No proxy scripts! (You are already using TOR and this will just burden the network)
  • No IP logger or similar de-anonymizer sites!
  • I preserve the right to delete any site for violating these rules and adding new rules at any time.

After reading the rules, I wondered, “Is this the dark web or not?” 😉

The list of sites hosted by Daniel can be found at: http://dhosting4okcs22v.onion/list.php, some 1409 public sites (577 hidden) as of today.

Daniel maintains an “Onion link list,” 5926 links, with this disclaimer:

I’m not responsible for any content of websites linked here. Be careful and use your brain.

Daniel has other resources and a simple registration form for a site.

I haven’t used Daniel’s hosting service but will in the near future to “kick the tires” so to speak.

Caution: This is a “free” dark web hosting service, so ask yourself what economic model is in play? If you are truly hidden from Daniel, how does he make a commodity out of you or your use of his service?

No reflection on Daniel as a person, assuming there is a Daniel person and he isn’t working for some intelligence service.

Intelligence services that steal, kidnap and murder based on whim and caprice are not above mis-representing themselves on the Dark Web.

That’s hard news to take, that intelligence services aren’t “playing fair” but it’s a fact. Intelligence services don’t play fair.

Prepare and play accordingly.

October 28, 2017

Useless List of Dark Web Bargains – NRA Math/Social Science Problems

Filed under: Cybersecurity,Dark Web,Malware,Security — Patrick Durusau @ 3:00 pm

A hacker’s toolkit, shocking what you can buy on Dark Web for a few bucks by Mark Jones.

From the post:

Ransomware

  • Sophisticated license for widespread attacks $200
  • Unsophisticated license for targeted attacks $50

Spam

  • 500 SMS (Flooding) $20
  • 500 malicious email spam $400
  • 500 phone calls (Flooding) $20
  • 1 million email spam (legal) $200

What makes this listing useless? Hmmm, did you notice the lack of URLs?

With URLs, a teacher could create realistic math problems like:

How much money would Las Vegas shooting survivors and families of the deceased victims have to raise to “flood” known NRA phone numbers during normal business hours (US Eastern time zone) for thirty consecutive days? (give the total number of phone lines and their numbers as part of your answer)

or research problems (social science/technology),

Using the current NRA 501(c)(4) report, choose a minimum of three (3) directors of the NRA and specify what tools, Internet or Dark Web, you would use to find additional information about each director, along with the information you discovered with each tool for each director.

or advanced research problems (social science/technology),

Using any tool or method, identify a minimum of five (5) contributors to the NRA that are not identified on the NRA website or in any NRA publication. The purpose of this exercise is to discover NRA members who have not been publicly listed by the NRA itself. For each contributor, describe your process, including links and results.

Including links in posts, even lists, helps readers reuse and even re-purpose content.

It’s called the World Wide Web for a reason, hyperlinks.

July 26, 2017

Weaponry on the Dark Web – Read The Fine Print

Filed under: Dark Web,Journalism,News,Reporting — Patrick Durusau @ 2:01 pm

The NextGov headline screaming: 3D-Printed Gun Designs Are Selling For $12 On The Dark Web is followed by this pic:

But the fine print in the caption reads:

The additive-manufactured RAMBO system includes an NSRDEC-designed standalone kit with printed adjustable buttstock, mounts, grips and other modifications—modifications made possible by the quick turnaround time afforded by 3D printing. // US Army

So….

  1. This is NOT a printable gun from the Dark Web
  2. Printable parts ARE buttstock, mounts, grips, not the gun itself

Just so you know, the RAND paper doesn’t include this image. 😉

In fact, Behind the curtain: The illicit trade of firearms, explosives and ammunition on the dark web by Giacomo Persi Paoli, Judith Aldridge, Nathan Ryan, and Richard Warnes concedes that trading of weapons on the Dark Web is quite small beside non-Dark Web trafficking.

Missing in the discussion of 3-D weapons plans is a comparison of the danger they pose relative to other technologies.

The Cal Fire map leaves no doubt that $12 or less in gasoline and matches can produce far more damage than any 3-D printed weapon. Without the need for a 3-D printer.

Yes?

All weapons pose some danger. Decisions makers need to know the relative dangers of weapons vis-a-vis each other.

A RAND report on the comparative danger of weapons would be far more useful than reports on weapons and their sources in isolation.

December 11, 2016

Dark Web Data Dumps

Filed under: Dark Web — Patrick Durusau @ 10:05 pm

Dark Web Data Dumps by Sarah Jamie Lewis.

From the webpage:

A collection of structured data obtained from scraping the dark web.

Why

Researchers need more data about the dark web.

The best resource we have right now are the [Black Market Archives](https://www.gwern.net/Black-market%20archives), scrapes of various marketplaces made by Gwern et al in 2013.

Much has changed since 2013, and complete web dumps, while useful for some research tasks, frustrate and complicate others.

Further, governments & corporations are already building out such data in private & for profit.

Citing
This Resource

Sarah Jamie Lewis. Dark Web Data Dumps, 2016, 10 Dec 2016. Web. [access date] https://polecat.mascherari.press/onionscan/dark-web-data-dumps

Valhalla Marketplace Listings October 2016

Sarah Jamie Lewis. Dark Web Data Dumps, 2016, October 2016. Web. [access date] https://polecat.mascherari.press/onionscan/dark-web-data-dumps/blob/master/valhalla-marketplace-listings-2016-10.csv

Supporting

If you would like to support this, and other dark web research, please become a patron.

Valhalla Marketplace Listings runs 1.3 MB and 16511 lines.

Sans the Rolex watch ads, makes great New Year’s party material. 😉

September 4, 2016

Running a Tor Exit Node for fun and e-mails

Filed under: Dark Web,Tor — Patrick Durusau @ 7:34 pm

Running a Tor Exit Node for fun and e-mails by Antonios A. Chariton.

From the post:


To understand the logistics behind running a Tor Exit Node, I will tell you how I got to run my Tor Exit Node for over 8 months. Hopefully, during the process, some of your questions will be answered, and you’ll also learn some new things. Please note that this is my personal experience and I cannot guarantee it will be the same for you. Also, I must state that I have run other exit nodes in the past, as well as multiple non-exit relays and bridges.
…

A great first person account on running a Tor Exit Node.

Some stats after 8 months:

  • It has been running for almost 8 months
  • It costs 4,90 EUR / month. In comparison, the same server in AWS would cost $1,122, or 992€ as of today
  • The total cost to date is 40€. In comparison, the same server in AWS would cost about 8,000€.
  • It is pushing up to 50 Mb/s, every second
  • It relayed over 70 TB of Tor traffic
  • It generated 2,729 Abuse E-Mails
  • It is only blocking port 25, and this to prevent spam
  • It helped hundreds or thousands of people to reach an uncensored Internet
  • It helped even more people browse the Internet anonymously and with privacy

If you’re not quite up to running an exit node, consider running a Tor relay node: Add Tor Nodes For 2 White Chocolate Mochas (Venti) Per Month.
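For context, a relay takes only a handful of torrc lines. The values below are placeholders, not recommendations; consult the Tor Project’s relay documentation before going live:

```text
# Minimal non-exit relay torrc (placeholder values)
Nickname MyRelayNickname
ContactInfo operator@example.com
ORPort 9001
ExitRelay 0
RelayBandwidthRate 1 MBytes
RelayBandwidthBurst 2 MBytes
# For an exit like the author's, set ExitRelay 1 and block spam:
# ExitPolicy reject *:25
```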

Considering the bandwidth used by governments for immoral purposes, the observation:


Finally, just like with everything else, we have malicious users. Not necessarily highly skilled criminals, but people in general who (ab)use the anonymity that Tor provides to commit things they otherwise wouldn’t.

doesn’t trouble me.

As a general rule, highly skilled or not, criminals don’t carry out air strikes against hospitals and such.

September 1, 2016

Dark Web OSINT With Python Part Three: Visualization

Filed under: Dark Web,Open Source Intelligence,Python,Tor — Patrick Durusau @ 4:40 pm

Dark Web OSINT With Python Part Three: Visualization by Justin.

From the post:

Welcome back! In this series of blog posts we are wrapping the awesome OnionScan tool and then analyzing the data that falls out of it. If you haven’t read parts one and two in this series then you should go do that first. In this post we are going to analyze our data in a new light by visualizing how hidden services are linked together as well as how hidden services are linked to clearnet sites.

One of the awesome things that OnionScan does is look for links between hidden services and clearnet sites and makes these links available to us in the JSON output. Additionally it looks for IP address leaks or references to IP addresses that could be used for deanonymization.

We are going to extract these connections and create visualizations that will assist us in looking at interesting connections, popular hidden services with a high number of links and along the way learn some Python and how to use Gephi, a visualization tool. Let’s get started!
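Gephi will ingest a plain CSV edge list, so getting from extracted links to a graph can be as simple as writing (source, target) pairs; the `Source`/`Target` headers are what Gephi’s import wizard expects:

```python
import csv

def write_gephi_edges(links, out_path):
    """Write (source, target) pairs as a CSV edge list Gephi can import directly."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Source", "Target"])  # Gephi's expected headers
        writer.writerows(links)
```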

Justin tops off this great series on OnionScan by teaching the rudiments of using Gephi to visualize and explore the resulting data.

Can you map yourself from the Dark Web to a visible site?

If so, you aren’t hidden well enough.

August 10, 2016

Dark Web OSINT with Python Part Two: … [Prizes For Unmasking Government Sites?]

Filed under: Dark Web,Open Source Intelligence,Python,Tor — Patrick Durusau @ 4:31 pm

Dark Web OSINT with Python Part Two: SSH Keys and Shodan by Justin.

From the post:

Welcome back good Python soldiers. In Part One of this series we created a wrapper around OnionScan, a fantastic tool created by Sarah Jamie Lewis (@sarajamielewis). If you haven’t read Part One then go do so now. Now that you have a bunch of data (or you downloaded it from here) we want to do some analysis and further intelligence gathering with it. Here are a few objectives we are going to cover in the rest of the series.

  1. Attempt to discover clearnet servers that share SSH fingerprints with hidden services, using Shodan. As part of this we will also analyze whether the same SSH key is shared amongst hidden services.
  2. Map out connections between hidden services, clearnet sites and any IP address leaks.
  3. Discover clusters of sites that are similar based on their index pages, this can help find knockoffs or clones of “legitimate” sites. We’ll use a machine learning library called scikit-learn to achieve this.

The scripts that were created for this series are quick little one-offs, so there is some shared code between each script. Feel free to tighten this up into a function or a module you can import. The goal is to give you little chunks of code that will teach you some basics on how to begin analyzing some of the data and more importantly to give you some ideas on how you can use it for your own purposes.

In this post we are going to look at how to connect hidden services by their SSH public key fingerprints, as well as how to expand our intelligence gathering using Shodan. Let’s get started!
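Objective 1 boils down to grouping scan results by SSH key fingerprint. A sketch assuming OnionScan’s JSON carries `sshKey` and `hiddenService` fields (field names are assumptions; check a sample report from your version):

```python
import json
from collections import defaultdict

def group_by_ssh_key(scan_files):
    """Group OnionScan JSON reports by SSH key fingerprint.

    Services sharing a fingerprint likely share a host -- a
    deanonymization lead worth checking against Shodan.
    """
    groups = defaultdict(list)
    for path in scan_files:
        with open(path) as f:
            result = json.load(f)
        key = result.get("sshKey")
        if key:
            groups[key].append(result.get("hiddenService", path))
    return groups
```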

Expand your Dark Web OSINT intel skills!

Being mindful that if you can discover your Dark Web site, so can others.

Anyone awarding Black Hat conference registrations for unmasking government sites on the Dark Web?

July 30, 2016

The Privileged Cry: Boo, Hoo, Hoo Over Release of OnionScan Data

Filed under: Cybersecurity,Dark Web,Security — Patrick Durusau @ 12:13 pm

It hasn’t taken long for the privileged to cry “boo, hoo, hoo” over Justin Seitz releasing the results of running OnionScan against over 8,000 Dark Web sites. You can find Justin’s dump here.

Joseph Cox writes in: Hacker Mass-Scans Dark Web Sites for Vulnerabilities, Dumps Results:

…Sarah Jamie Lewis, the creator of OnionScan, warns that publishing the full dataset like this may lead to some Tor hidden services being unmasked. In her own reports, Lewis has not pointed to specific sites or released the detailed results publicly, and instead only provided summaries of what she found.

“If more people begin publishing these results then I imagine there are a whole range of deanonymization vectors that come from monitoring page changes over time. Part of the reason I destroy OnionScan results once I’m done with them is because people deserve a chance to fix the issue and move on—especially when it comes to deanonymization vectors,” Lewis told Motherboard in an email, and added that she has, when legally able to, contacted some sites to help them fix issues quietly.

Sarah Jamie Lewis and others who seek to keep vulnerability data secret are making two assumptions:

  1. They should have exclusive access to data.
  2. Widespread access to data diminishes their power and privilege.

I agree only with #2 and it is the reason I support public and widespread distribution of data, all data.

Widespread access to data means it is your choices and abilities that determine its uses and not privilege of access.

BTW, Justin has the better of the exchange:


Seitz, meanwhile, thinks his script could be a useful tool to many people. “Too often we set the bar so high for the general practitioner (think journalists, detectives, data geeks) to do some of this larger scale data work that people just can’t get into it in a reasonable way. I wanted to give people a starting point,” he said.

“I am a technologist, so it’s the technology and resulting data that interest me, not the moral pros and cons of data dumping, anonymity, etc. I leave that to others, and it is a grey area that as an offensive security guy I am no stranger to,” he continued.

The question is: Do you want privileged access to data for Sarah Jamie Lewis and a few others or do you think everyone should have equal access to data?

I know my answer.

What’s yours?

Dark Web OSINT With Python and OnionScan: Part One

Filed under: Dark Web,Open Source Intelligence,Python — Patrick Durusau @ 10:47 am

Dark Web OSINT With Python and OnionScan: Part One by Justin.

When you tire of what passes for political discussion on Twitter and/or Facebook this weekend, why not try your hand at something useful?

Like looking for data leaks on the Dark Web?

You could, in theory at least, notify the sites of their data leaks. 😉

One of the aspects of announced leaks that never ceases to amaze me are reports that read:

Well, we pwned the (some string of letters) database and then notified them of the issue.

Before getting a copy of the entire database? What’s the point?

All you have accomplished is making another breach more difficult and demonstrating your ability to breach a system where the root password was most likely “god.”

Anyway, Justin gets you started on seeking data leaks on the Dark Web saying:

You may have heard of this awesome tool called OnionScan that is used to scan hidden services in the dark web looking for potential data leaks. Recently the project released some cool visualizations and a high level description of what their scanning results looked like. What they didn’t provide is how to actually go about scanning as much of the dark web as possible, and then how to produce those very cool visualizations that they show.

At a high level we need to do the following:

  1. Setup a server somewhere to host our scanner 24/7 because it takes some time to do the scanning work.
  2. Get TOR running on the server.
  3. Get OnionScan setup.
  4. Write some Python to handle the scanning and some of the other data management to deal with the scan results.
  5. Write some more Python to make some cool graphs. (Part Two of the series)

Let’s get started!
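Step 4’s wrapper is essentially a subprocess call plus JSON parsing. A hedged sketch (the `--jsonReport` flag and the report field names are assumptions; check `onionscan --help` and a sample report on your install):

```python
import json
import subprocess

def scan(onion, timeout=300):
    """Run onionscan against one hidden service and parse its JSON report.

    Assumes the onionscan binary is on PATH and emits JSON on stdout.
    """
    proc = subprocess.run(
        ["onionscan", "--jsonReport", onion],
        capture_output=True, text=True, timeout=timeout,
    )
    return parse_report(proc.stdout)

def parse_report(raw):
    """Extract the fields later parts of the series analyze (names assumed)."""
    data = json.loads(raw)
    return {
        "service": data.get("hiddenService"),
        "linked": data.get("linkedSites", []),
        "ip_leaks": data.get("ipAddresses", []),
    }
```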

Very much looking forward to Part 2!

Enjoy!

Powered by WordPress