In this short tutorial we'll run through how automations work and how you can use them inside your workflows to get work done.
Whilst scraping recipes enable you to design your own bot to capture data from any website, automations are pre-built and enable you to perform specific tasks or leverage third-party services inside your workflow.
At this point we have the following automations available:
Data input – enables you to provide a list of URLs or text to scraping recipes or automations
For lead generation and research:
Google search – enables you to extract search results from Google as a service
Email scraper – enables you to extract email addresses from any URL
Social media links scraper – enables you to extract social media profiles from any URL
Traffic analytics – enables you to get traffic estimates for any domain
WHOIS – enables you to get WHOIS contact and domain expiry information on any domain
Tech stack – enables you to get a list of technologies and 3rd party libraries used by any page
Meta tags – enables you to extract all the meta tags used on any page
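To make the last of these concrete, here is a minimal illustrative sketch (not Hexomatic's actual implementation) of the kind of data a meta tags extraction returns, using Python's standard-library HTML parser to collect name/content pairs from a page:

```python
from html.parser import HTMLParser

class MetaTagCollector(HTMLParser):
    """Collect (name or property, content) pairs from <meta> tags."""
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            key = attrs.get("name") or attrs.get("property")
            if key:
                self.tags[key] = attrs.get("content", "")

parser = MetaTagCollector()
parser.feed('<meta name="description" content="NY dentist">'
            '<meta property="og:title" content="Smile Co">')
print(parser.tags)  # {'description': 'NY dentist', 'og:title': 'Smile Co'}
```

The automation does this for you on any live URL; the sketch only shows the shape of the output.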
For making changes to the data you collect:
Number transformation – enables you to change number formatting in your data
Date transformation – enables you to change date formats in your data
Text transformation – enables you to change text formatting in your data
Measurement converter – enables you to convert one unit of measurement to another in your data
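To illustrate what these transformations do to your data, here is a small Python sketch (function names and formats are my own examples, not Hexomatic's) showing a date-format change and a unit conversion:

```python
from datetime import datetime

def reformat_date(value, src_fmt="%m/%d/%Y", dst_fmt="%Y-%m-%d"):
    """Parse a date string in one format and re-emit it in another."""
    return datetime.strptime(value, src_fmt).strftime(dst_fmt)

def miles_to_km(miles):
    """Convert miles to kilometres (1 mile = 1.609344 km)."""
    return miles * 1.609344

print(reformat_date("03/25/2021"))  # 2021-03-25
print(round(miles_to_km(10), 2))    # 16.09
```

In Hexomatic you pick the source and target formats from a menu instead of writing any of this yourself.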
We also have notification automations including:
Telegram – send a notification via Telegram
Slack – send a notification via Slack
Discord – send a notification via Discord
With more to come soon. You can find out more about each automation in the “Automations” section of Hexomatic.
How to use automations
To use any automation or scraping recipe you need to first create a new workflow. Then you will be able to drag and drop the automations you want to use.
Let’s run through a quick example using the Google search automation to find some leads and the email + social media link scrapers to find contact information for each website.
1/ To get started create a new workflow:
2/ Then add the Google search automation, specifying your search query and the geographic location you are targeting. In this example, let’s look for dentists in NY.
3/ Next add the social links scraper and use the URL field from the Google search automation as the source. This will run the social links scraper on every URL listed in the search results.
You can then specify which social profile links you are interested in, for example Facebook, Twitter, LinkedIn, etc.
4/ Next add the email scraper and use the URL field from the Google search automation as the source. This will look for email addresses listed on every URL found in the search results.
5/ Name your workflow and click Save.
You can now run the workflow right away, or schedule it to run at a later date or on a weekly / monthly basis.
When the workflow has completed you will be able to download the results in a .csv file.
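For readers curious what this workflow does conceptually, here is an illustrative Python sketch (Hexomatic performs all of this without code, and its real extraction logic is certainly more robust) of scanning a page's HTML for email addresses and social profile links:

```python
import re

# Simplified patterns for illustration only; real-world extraction is messier.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
SOCIAL_RE = re.compile(r"https?://(?:www\.)?(?:facebook|twitter|linkedin)\.com/[\w./-]+")

def extract_contacts(html):
    """Pull email addresses and social profile links out of a page's HTML."""
    return {
        "emails": sorted(set(EMAIL_RE.findall(html))),
        "social": sorted(set(SOCIAL_RE.findall(html))),
    }

sample = ('<a href="https://www.facebook.com/nydentist">Facebook</a> '
          'Contact: hello@nydentist.example')
print(extract_contacts(sample))
```

The workflow runs the equivalent of this over every URL returned by the Google search step and writes the combined results into your .csv.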