ShadowTrackr


Adding and removing assets through the API

03 May 2020
I love automating stuff. If you do this properly from the start you can do so much more work in so much less time. Really, any task you do more than twice should be automated if possible.

ShadowTrackr power users who want to automate things can now add and remove assets through the API, in bulk. Just throw a mixed list of URLs, IPs and subnets at it and it will validate, deduplicate and add them for you. Check out the details in the API documentation.
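If you want to script this, the API documentation has the authoritative details. As a rough sketch of what a bulk add could look like in Python (the base URL, endpoint path, payload shape and auth header below are assumptions for illustration, not the real spec):

    import requests

    API_BASE = "https://shadowtrackr.com/api"  # hypothetical base URL, check the docs
    API_KEY = "your-api-key"                   # hypothetical auth, check the docs

    # A mixed list of urls, ips and subnets; the API validates and deduplicates.
    assets = [
        "https://example.com",
        "203.0.113.10",
        "198.51.100.0/24",
    ]

    # Hypothetical bulk-add call; the real endpoint and field names may differ.
    resp = requests.post(
        API_BASE + "/assets/add",
        headers={"Authorization": "Bearer " + API_KEY},
        json={"assets": assets},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())

Removing works the same way against the remove endpoint; either way, the point is that one request replaces a long series of manual clicks.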

And if you have any cool API ideas, I'm always happy to hear them. Have fun!

Website scanning in-depth

19 April 2020
Scanning a website seems easy, and it is if you just do a one-off, single scan for a URL.

Things get more interesting when you host your website on multiple servers (for better performance or reliability). You probably also have both IPv4 and IPv6 addresses available. Your website runs on HTTPS, but you want your visitors to be able to find you without typing in the protocol, so you also have HTTP configured. That's two protocols, on two IP versions, and possibly multiple hosts.
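To see how quickly the combinations add up, here is a minimal sketch that enumerates scan targets for a single hostname. It only uses standard DNS resolution, and the hostname is a placeholder:

    import socket

    def scan_targets(hostname):
        """Enumerate (protocol, ip, port) combinations for one hostname."""
        targets = []
        # getaddrinfo returns both A (IPv4) and AAAA (IPv6) records.
        for scheme, port in (("http", 80), ("https", 443)):
            for family, _, _, _, sockaddr in socket.getaddrinfo(
                hostname, port, proto=socket.IPPROTO_TCP
            ):
                targets.append((scheme, sockaddr[0], port))
        # Deduplicate while keeping order.
        return list(dict.fromkeys(targets))

    # Two protocols x two IP versions x however many hosts sit behind the name.
    for target in scan_targets("www.example.com"):
        print(target)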

Some websites run in the cloud. You can pin this to a specific cloud in a specific country (which most governments do with their websites), but you can also let the cloud provider figure out the best spot. If you do this with Azure, your website will be served from the nearest Azure region. ShadowTrackr has nodes all over the world, which means we'll detect your website in multiple clouds. That's on purpose of course, but it does complicate things.
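You can see this effect yourself by resolving the same hostname with the EDNS Client Subnet option set to client networks in different regions: GeoDNS-style setups will often return different front-end addresses per region. A minimal sketch with dnspython, using a placeholder hostname and illustrative subnets:

    import dns.edns
    import dns.message
    import dns.query
    import dns.rdatatype

    HOSTNAME = "www.example.com"  # placeholder
    RESOLVER = "8.8.8.8"          # a public resolver that honours ECS

    # Illustrative client subnets in different parts of the world.
    for subnet in ("81.2.69.0", "1.33.0.0", "203.0.113.0"):
        ecs = dns.edns.ECSOption(subnet, 24)
        query = dns.message.make_query(HOSTNAME, "A", use_edns=0, options=[ecs])
        answer = dns.query.udp(query, RESOLVER, timeout=5)
        ips = [item.address
               for rrset in answer.answer if rrset.rdtype == dns.rdatatype.A
               for item in rrset]
        print(subnet, "->", ips)

If the answers differ per subnet, the site is being served from multiple locations, and a scanner with nodes in those locations will find all of them.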

Then there are CDNs like Cloudflare and Akamai. You host your website on a server where the CDN can reach you, and they handle all your visitor requests. You need a trick to point your visitors at the CDN of course, and this is where it gets ugly for scanners. There are multiple ways of doing this, and they can be mixed and matched. On top of that, some CDNs hire subcontractors that are really hard to attribute, and you might end up detecting Vodafone instead of Akamai.
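One of the more common tricks is a CNAME pointing at the CDN's edge network, and following that chain is often the quickest attribution hint, when it works. A minimal sketch with dnspython, assuming the chain ends in a recognisable CDN domain (which, as noted, it sometimes does not):

    import dns.resolver

    def cname_chain(hostname, max_hops=10):
        """Follow the CNAME chain; the final name often hints at the CDN."""
        chain = [hostname]
        for _ in range(max_hops):
            try:
                answer = dns.resolver.resolve(chain[-1], "CNAME")
            except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
                break  # no further CNAME, end of the chain
            chain.append(str(answer[0].target).rstrip("."))
        return chain

    # Illustrative output: ['www.example.com', 'www.example.com.edgekey.net',
    #                       'e1234.a.akamaiedge.net']
    print(cname_chain("www.example.com"))

When the chain ends at a subcontractor's network instead of the CDN's own domain, you fall back to the IP's ASN and WHOIS data, and that is where the Vodafone-instead-of-Akamai confusion comes from.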

It seemed so easy to scan a website, but in practice it can get really complex. The goal has always been for ShadowTrackr to detect all your website instances on all internet-reachable hosts, including clouds and CDNs. I had underestimated how complex this is and did not achieve the goal from the start. After getting it wrong a couple of times, this week’s update features a much improved algorithm. This might result in a storm of new websites being found on your timeline. I’m on it and regularly clean things up until they are all properly ingested and monitored.

If you do find irregularities, or have any other questions, drop me a line.

UX and GUI improvements

05 April 2020
The last two weeks were mostly spent on improving the graphical user interface (GUI) and user experience (UX).

ShadowTrackr is continuously growing. Sometimes functionality is improved, sometimes features are added, and things get "bolted on". Over time, clear and readable webpages will devolve into Frankenpages.

Two weeks of tender love and care should have fixed that now. I’m happy with the new layout. Warnings and errors are more consistent, and many asset properties are now clickable links. These links will generate a search query and show you all other assets with the same properties.

Finally, the exports have improved too. Try mailing yourself a certificate, website, host or domain page: just click "mail report" in the context menu (three dots, upper right corner). These reports are meant for the IT staff that has to fix things, and you can mail them directly from your account without having to go through copy-paste hell. Hopefully this lowers the bar and encourages you to improve your security posture even more.

I hope you like the update as much as I do. If you have any suggestions or special requests, please let me know!
