Retail

Competitor monitoring on autopilot

For a retailer with a catalog of 200+ products, manually tracking prices from 5 competitors consumes 5 hours per week and produces data that is already outdated. Automated monitoring with intelligent scraping and daily reports eliminates that manual work entirely.

Result: 5h/week → 0
Scraping · Reports

The context

In consumer electronics retail, an online store competing directly with 5 retailers in the same segment needs to maintain competitive pricing across a catalog of over 200 active products.

Without automation, an analyst on the commercial team spends every Monday between 4 and 5 hours visiting competitor websites, finding equivalent products, copying prices into a spreadsheet, and comparing them manually. The resulting report reaches the commercial director on Tuesday — with data already a day old.

The challenge

The consumer electronics market is extremely dynamic. Competitors change prices several times a week, sometimes several times a day during promotional campaigns. A weekly report with day-old data is insufficient for making competitive pricing decisions.

The manual process is also error-prone. With 200+ products and 5 competitors, the analyst handles over 1,000 data points. Transcription errors are frequent, and some products get skipped due to time constraints. Pricing decisions end up based on incomplete and outdated data.

The solution

The scraping system is built with n8n and Puppeteer, automatically visiting the websites of all 5 competitors every night. The bot navigates product pages, extracts prices, availability, and active promotions, and stores everything in a structured database.
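Once Puppeteer has pulled the raw price text off a product page, it still needs to be normalized before storage. A minimal sketch of that cleanup step, assuming prices are scraped as strings like "$1,299.99" (the `parsePrice` helper and the comma-thousands format are illustrative assumptions, not the system's actual code):

```javascript
// Hypothetical helper for the extraction step: turns raw scraped price
// text (e.g. "$1,299.99" or "1,299.99 USD") into a number, or null if
// the text contains no parseable price.
function parsePrice(rawText) {
  // Keep only digits, dots, and commas; drop currency symbols and spaces.
  const cleaned = rawText.replace(/[^\d.,]/g, "");
  // Assumed format: "," as thousands separator, "." as decimal separator.
  const value = Number.parseFloat(cleaned.replace(/,/g, ""));
  return Number.isNaN(value) ? null : value;
}

console.log(parsePrice("$1,299.99")); // → 1299.99
console.log(parsePrice("Out of stock")); // → null
```

Normalizing at scrape time keeps the database comparable across competitors that format prices differently.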

Every morning at 7:00 AM, the system generates an automatic report that includes: products where the competitor is cheaper, price variations from the previous day, alerts for new promotions, and a price adjustment recommendation based on rules defined by the commercial team.

The report is sent via email and Slack to the commercial director and the pricing team. It includes clear visualizations with color coding: red for products priced above market, green for well-positioned ones, and yellow for variations that need analysis. Typical implementation takes 7 days.
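The color coding can be expressed as a simple threshold function. The 2% band used here for "needs analysis" is an assumed example threshold, not a figure from the case study:

```javascript
// Illustrative mapping from price position to the report's color codes.
// The ±2% band is an assumption; the real thresholds would be tuned by
// the pricing team.
function priceStatus(ourPrice, bestCompetitorPrice) {
  const delta = (ourPrice - bestCompetitorPrice) / bestCompetitorPrice;
  if (delta > 0.02) return "red";    // priced above market
  if (delta < -0.02) return "green"; // well positioned
  return "yellow";                   // close to market — needs analysis
}

console.log(priceStatus(105, 100)); // → "red"
console.log(priceStatus(95, 100));  // → "green"
console.log(priceStatus(100, 100)); // → "yellow"
```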

Results

The 5 hours of weekly manual work are completely eliminated. The analyst who used to do the tracking now spends that time on strategic market trend analysis — higher-value work.

The team goes from making pricing decisions with weekly data to daily data. Reaction time to competitor price changes drops from days to hours. In the first quarter, this translates to an 8% increase in average margin on monitored products.

With daily data, the team knows every morning exactly where they stand against the competition and can act before the customer compares. It's like playing chess while seeing the whole board.

Lessons learned

  • Scraping requires maintenance. Websites change their structure periodically, so an alerting system that detects when a scraper stops working correctly is essential.
  • Automated pricing rules need guardrails. Maximum and minimum caps prevent the system from recommending prices that erode margin below acceptable levels.
  • Historical pricing data turns out to be as valuable as real-time data. After 3 months, the team starts identifying seasonal competitor pricing patterns that were previously invisible.

Facing a similar challenge?

Let's talk