In this short tutorial we will walk through how Hexomatic works and the core concepts to keep in mind when running your workflows.
Hexomatic consists of three core elements: scraping recipes, automations, and workflows.
The key point to remember is that everything in Hexomatic runs inside a workflow: the workflow is where you add scraping recipes or automations and decide its output and scheduling.
Let’s look into each core element in more detail:
1/ Scraping recipes
Scraping recipes enable you to create your own bots to scrape data from any website via a point-and-click browser. There are two ways of using scraping recipes:
-You can create a scraping recipe based on a single search-result or category page to automatically capture all the listing names and URLs (paginating through all the pages available).
-Or you can provide a list of product or listing page URLs (using the data input automation) and scrape specific fields from each URL (keep in mind all pages need to share the same HTML template).
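As a rough mental model, the two modes can be sketched in plain Python. Everything below (the function names, the sample site data) is hypothetical, for illustration only; it is not Hexomatic's actual API.

```python
# Conceptual sketch of the two scraping-recipe modes.
# All names and data here are hypothetical, not Hexomatic's API.

# Fake "site": one category page listing product URLs, plus detail
# pages that all share the same HTML template (title + price).
CATEGORY_LISTINGS = ["https://example.com/p/1", "https://example.com/p/2"]
DETAIL_PAGES = {
    "https://example.com/p/1": {"title": "Widget", "price": "9.99"},
    "https://example.com/p/2": {"title": "Gadget", "price": "19.99"},
}

def scrape_category(listing_urls):
    # Mode 1: capture every listing URL from a category/search page
    # (in Hexomatic, pagination is followed automatically).
    return [{"url": url} for url in listing_urls]

def scrape_details(urls, fields):
    # Mode 2: for each supplied URL, scrape the same named fields.
    # This only works because every page shares one HTML template.
    return [{field: DETAIL_PAGES[url][field] for field in fields}
            for url in urls]

listings = scrape_category(CATEGORY_LISTINGS)
details = scrape_details([row["url"] for row in listings],
                         ["title", "price"])
```

Mode 1 gives you a column of URLs; mode 2 takes such a column as input and turns each page into a row of named fields.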
Scraping recipes enable you to capture the following elements from a page:
-Source URLs (for example image URLs)
To get started simply load a page and then click on any element. Hexomatic then enables you to:
-Select that specific element
-Select all matching elements on the page (for example all the product titles or URLs).
See our Scraping recipe tutorial for more information.
To run a scraping recipe, simply create a new workflow and drag and drop the recipe into it. Workflows also enable you to chain additional scraping recipes or automations, as well as specify whether to run everything one time or on a recurring schedule.
2/ Automations

Automations, on the other hand, are pre-built and enable you to perform actions inside a workflow, for example fetching search results from Google, detecting social media or email contact details, transforming text, dates or numbers, or performing translations.
You can find out more about each available automation in the “Automations” section. To run any automation, simply create a workflow and drag and drop the automations you would like to use.
See our automation tutorials for more information about how each automation works.
3/ Workflows

Workflows are where you run your scraping recipes and automations.
The best way to visualize how workflows work is to picture a spreadsheet with columns and rows. When you run a scraping recipe, for example, you are essentially populating the spreadsheet with rows of structured data captured from web pages.
Then, when you add automations to your workflow, each automation runs once per row of the spreadsheet, based on its input field. For example, if you want to detect social media links for each page, you can run that automation on the URL field.
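This spreadsheet model can be sketched in a few lines of Python. Again, the helper names and sample data are hypothetical, purely to illustrate the per-row behavior; Hexomatic itself requires no code.

```python
# Hypothetical sketch of the workflow-as-spreadsheet model:
# rows are dicts, and an automation is applied to a chosen
# input field of every row, writing its result to a new column.

def run_automation(rows, input_field, automation, output_field):
    # Run `automation` once per row, reading `input_field`
    # and storing the result in `output_field`.
    for row in rows:
        row[output_field] = automation(row[input_field])
    return rows

# Rows as a scraping recipe might have produced them (made-up data).
rows = [
    {"title": "Product A", "url": "https://example.com/a"},
    {"title": "Product B", "url": "https://example.com/b"},
]

# A stand-in "automation": extract the domain from each URL.
def extract_domain(url):
    return url.split("//", 1)[-1].split("/", 1)[0]

run_automation(rows, "url", extract_domain, "domain")
```

Each automation reads one column as its input and adds its results as a new column, which a later automation in the chain can in turn use as its input.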
When you create a workflow, choose a name and drag and drop the scraping recipes or automations you would like to run. For each, you will need to specify the input field the automation or scraping recipe should run on.
The great thing about workflows is that you can chain together as many scraping recipes or automations as you need.
Then, when you click continue, you will be able to choose whether to run the workflow one time manually or on a recurring schedule (for example, weekly).
You can find all your workflows and their statuses in the “Workflows” section. When a workflow has run successfully, you can then download the results in CSV format.
If a workflow has failed, you can find out more by clicking on the workflow and checking its log to debug the issue.