diff --git a/docs/docs/main/index.html b/docs/docs/main/index.html index b520601..2b4233a 100644 --- a/docs/docs/main/index.html +++ b/docs/docs/main/index.html @@ -239,6 +239,71 @@
The goal is to keep Lookyloo as focused as possible on rendering URLs and easing their investigation, but +there are quite a few use cases that are either covered by other tools that existed before, or that required custom development.
+Initially, the code using Playwright to capture URLs was integrated into Lookyloo itself via +PlaywrightCapture, but capturing a URL is a fairly common task +(see AIL Framework), +so it made sense to split it into a dedicated, standalone project called lacus. +The advantage of using Lacus is that you can run the browsers loading arbitrary URLs on a dedicated machine that could +potentially be compromised. PyLacus is the Python module you can use to integrate +Lacus with your own tool.
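As a minimal sketch of that integration, the helper below enqueues a URL on a Lacus instance and polls until the capture is done. The client is injected so the logic stands alone; the `DONE == 1` status value mirrors Lacus' capture-status enum, but treat it as an assumption and check the PyLacus documentation for the exact parameters.

```python
import time


def wait_for_capture(client, url: str, timeout: int = 60, poll: int = 2) -> dict:
    """Enqueue a URL on a Lacus instance and poll until the capture finishes.

    `client` is expected to expose enqueue / get_capture_status / get_capture,
    as the PyLacus client does.
    """
    uuid = client.enqueue(url=url)
    deadline = time.time() + timeout
    while time.time() < deadline:
        # 1 maps to DONE in Lacus' capture-status enum (assumption; check the docs).
        if client.get_capture_status(uuid) == 1:
            return client.get_capture(uuid)
        time.sleep(poll)
    raise TimeoutError(f'Capture {uuid} not finished after {timeout}s')


# Hypothetical wiring against a real instance:
# from pylacus import PyLacus
# lacus = PyLacus('https://lacus.example.com')
# result = wait_for_capture(lacus, 'https://www.circl.lu')
```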
+Note that Lookyloo itself doesn’t require a standalone lacus, as it will fall back to LacusCore +and run the capture on the same machine. If you want to implement a similar fallback mechanism in your own project, able to pick either PyLacus or LacusCore, +have a look at the documentation of the respective projects; the APIs are designed so that this is relatively easy.
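That fallback can be sketched as a small chooser. The factories are injected so the logic is self-contained and testable; the commented wiring with the real libraries is hypothetical, not Lookyloo's actual code.

```python
def pick_backend(remote_url, make_remote, make_local):
    """Prefer a remote Lacus instance (reachable via PyLacus) when one is
    configured and up; otherwise fall back to a local LacusCore."""
    if remote_url:
        client = make_remote(remote_url)
        # PyLacus exposes an is_up check for the remote instance (assumption).
        if getattr(client, 'is_up', False):
            return client
    return make_local()


# Hypothetical wiring with the real libraries:
# from pylacus import PyLacus
# from lacuscore import LacusCore
# backend = pick_backend('https://lacus.example.com', PyLacus,
#                        lambda: LacusCore(redis_connection))
```

Because PyLacus and LacusCore expose deliberately similar capture APIs, the rest of your code can use the returned backend without caring which one it got.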
+Capturing a single URL is nice, but sometimes you want to monitor it: to see if something unexpected changes (defacement), +to be informed when a phishing website has been taken down, or to be notified when a newly registered domain that was a parking page becomes +something else. That’s where the monitoring platform becomes useful. +When enabled, you can trigger a monitoring session from Lookyloo, or via the Python module, PyLookylooMonitoring.
+The monitoring platform will automatically notify you when something changes between the last two captures. The diff covers all the URLs up to the +final redirect, and compares the resources loaded on that page (with the possibility to exclude some).
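As a toy illustration of that comparison (the real platform also diffs the redirect chain and more capture metadata), a set-based resource diff with exclusions might look like:

```python
def diff_resources(old: set, new: set, exclude: frozenset = frozenset()) -> dict:
    """Compare the resources loaded by two captures, ignoring excluded URLs.

    Returns which resources disappeared and which appeared between captures.
    """
    old = {url for url in old if url not in exclude}
    new = {url for url in new if url not in exclude}
    return {'gone': sorted(old - new), 'appeared': sorted(new - old)}
```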
+In order to get contact information for IPs and domains, it is handy to be able to get the relevant whois entry. uWhois +will do that, and also keeps a record of the entry, effectively offering a WhoWas service too.
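uWhois speaks the plain whois protocol, so any whois client can query it: send the query terminated by CRLF over TCP port 43 and read until the server closes the connection. A minimal client sketch (the hostname is a placeholder for your uWhois instance):

```python
import socket


def whois_query(server: str, query: str, port: int = 43) -> str:
    """Minimal whois client: send the query, CRLF-terminated, over TCP
    and read the response until the server closes the connection."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall(query.encode() + b'\r\n')
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b''.join(chunks).decode(errors='replace')


# Hypothetical usage against your own instance:
# print(whois_query('uwhois.example.com', 'circl.lu'))
```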
+Sometimes, URLs point to a file, and Lookyloo itself can’t do anything with it. If you enable the Pandora +connector, you can submit that file (or any file encountered during the capture) to a Pandora instance and investigate it from there.
+And if your Pandora instance is configured accordingly, you can also submit a URL from there to Lookyloo.
+Lookyloo will extract a lot of indicators out of the URL captures, and these indicators will be correlated across the captures on that Lookyloo instance. +It is not designed (and won’t be) to search on other Lookyloo instances, or to share indicators with other systems (but you could implement that yourself +using PyLookyloo if you really want to). +The recommended way to do that is to use MISP as a storage/sharing platform for the indicators.
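If you do want to push indicators to MISP yourself, a hedged sketch with PyLookyloo and PyMISP could look like the following. The helper is hypothetical and only shapes a redirect chain into MISP-style url attributes; the commented wiring uses method names from the public clients, so double-check them against the current documentation.

```python
def redirects_to_attributes(redirects: list) -> list:
    """Turn a capture's redirect chain into MISP-style url attribute dicts."""
    return [{'type': 'url', 'value': url, 'to_ids': False} for url in redirects]


# Hypothetical wiring with the real libraries (instance URLs and the capture
# UUID are placeholders; verify the method names in the PyLookyloo/PyMISP docs):
# from pylookyloo import Lookyloo
# from pymisp import PyMISP, MISPEvent
# lookyloo = Lookyloo('https://lookyloo.example.com')
# chain = lookyloo.get_redirects(capture_uuid)
# event = MISPEvent()
# event.info = 'Lookyloo capture'
# for attr in redirects_to_attributes(chain['response']['redirects']):
#     event.add_attribute(**attr)
# PyMISP('https://misp.example.com', api_key).add_event(event)
```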
+After the build and startup are complete, Lookyloo should be available at link: -http://localhost:5000/.
+http://localhost:5100/. If you want to persist the data between runs, uncomment the "volumes" definition in the last two lines of [docker-compose.yml](docker-compose.yml) and define a data storage directory on your Docker host system there.
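An uncommented `volumes` section might look like the fragment below. The host path and container path are examples only; match them to what the actual docker-compose.yml expects.

```yaml
    volumes:
      # Host directory (create it first) mapped into the container,
      # so capture data survives container restarts.
      - /opt/lookyloo-data:/lookyloo/data
```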
diff --git a/docs/sitemap.xml b/docs/sitemap.xml index d64dfa2..613d429 100644 --- a/docs/sitemap.xml +++ b/docs/sitemap.xml @@ -2,150 +2,150 @@