Re-inventing Cybersecurity from the ground up

Backed by a team of skilled engineers and analysts, Trawlr aims to revolutionise the
cybersecurity industry with leading data aggregation services.

READ MORE

End-to-end data normalisation services

Through partnerships with some of the world's biggest feed providers, and our own network of crawlers and honeypots, we collect large amounts of inbound phishing- and malware-related data.

This data is fed through a normalisation service that collates and indexes it, checks its validity, and stores it in preparation for analysis by our machine learning tools.

LEARN MORE

Aggregation

Data is sourced from public and private feeds, and collated into a single stream.

Normalisation

The inbound stream is processed to reduce data redundancy and increase integrity and accuracy.

Evaluation

Machine learning and human analysis are used to evaluate the data and calculate an impact risk score.
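In simplified form, the three stages above could be sketched as a small pipeline. This is an illustrative sketch only: the record shape, the deduplication rules, and the scoring function are our own stand-ins, not Trawlr's actual architecture or API.

```python
from dataclasses import dataclass

# Illustrative sketch of the three-stage pipeline; all names here
# (FeedRecord, aggregate, normalise, evaluate) are hypothetical.

@dataclass(frozen=True)
class FeedRecord:
    url: str          # indicator observed in a feed
    source: str       # which feed reported it
    confidence: float # 0.0-1.0 score reported by the source

def aggregate(*feeds):
    """Aggregation: collate public and private feeds into a single stream."""
    for feed in feeds:
        yield from feed

def normalise(stream):
    """Normalisation: deduplicate and validate to reduce redundancy
    and increase integrity and accuracy."""
    seen = set()
    for record in stream:
        url = record.url.strip().lower()
        if not url.startswith("http") or url in seen:
            continue  # drop malformed or duplicate indicators
        seen.add(url)
        yield FeedRecord(url, record.source, record.confidence)

def evaluate(stream):
    """Evaluation: assign each indicator an impact risk score (a trivial
    stand-in for the machine-learning and analyst review stage)."""
    return {r.url: round(r.confidence, 2) for r in stream}

# Two overlapping feeds: the duplicate and the malformed entry are dropped.
public = [FeedRecord("http://phish.example/login", "feed-a", 0.9)]
private = [FeedRecord("HTTP://phish.example/login ", "feed-b", 0.7),
           FeedRecord("not-a-url", "feed-b", 0.5)]

scores = evaluate(normalise(aggregate(public, private)))
```

Running the sketch collapses the two overlapping feeds to a single scored indicator, which is the essence of the redundancy reduction described above.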

Data re-imagined, from the ground up

We may be new to the game, but that doesn't mean we don't dream big. Using industry-leading data collection architectures and machine learning for analytics, risk identification and reporting, we generate high-quality aggregated phishing data feeds.

We're always on the lookout for new data to integrate. If you would like to contribute data, or to integrate and consume our data streams, reach out to our engineering team!

REACH OUT

Our Trusted Partners

We work with the best in the industry to provide outstanding service to our customers.