How to Set Up Email Alerts

Get notified in Slack, PagerDuty, or any channel when specific emails arrive at your inbound address. Build keyword-triggered alerts using JsonHook webhooks and simple content matching.

Table of Contents
  1. Overview
  2. Prerequisites
  3. Step-by-Step Instructions
  4. Code Example
  5. Common Pitfalls
  6. Frequently Asked Questions

Overview

Email alerts transform inbound emails into real-time notifications in other channels. Common use cases:

  • Alert your on-call engineer in Slack when a server sends a critical error email
  • Page your team in PagerDuty when a monitoring system sends an alert email
  • Post a Slack message when a high-value lead's email arrives
  • Send a mobile push notification when an important supplier emails
  • Trigger a Zapier/Make workflow when specific keywords appear in an inbound email

The pattern is always the same: email arrives at a JsonHook address → webhook delivers JSON to your handler → handler checks conditions → handler fires the alert if conditions match. The alert can go to any channel that accepts an HTTP call: Slack, PagerDuty, Twilio, Discord, Teams, or any custom API.
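
In code, the payload side of that pattern reduces to a couple of small types and one predicate. A minimal sketch (field names such as deliveryId, from, subject, and textBody are illustrative; check the actual JsonHook payload schema for the exact shape):

```typescript
// Assumed shape of the webhook payload delivered to your handler.
interface InboundEmail {
  from: string;        // sender address
  subject: string;
  textBody?: string;   // plain-text body, when present
}

interface WebhookPayload {
  deliveryId: string;  // unique per delivery; useful for idempotency
  email: InboundEmail;
}

// The whole pipeline hinges on one question: does this email match?
function matchesAlertCondition(p: WebhookPayload): boolean {
  return /critical|error|down|failed/i.test(p.email.subject);
}
```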

Prerequisites

Requirements for email alerting:

  • A JsonHook inbound address configured to receive the relevant emails
  • Webhook URLs or API credentials for your alert destinations (Slack incoming webhook, PagerDuty events API key, etc.)
  • Clear alert conditions: which emails should trigger an alert, and what information the alert should contain

Step-by-Step Instructions

Build an email-to-alert pipeline:

  1. Create a JsonHook address pointed at your alert handler webhook.
  2. Define your alert conditions. For example: from a specific sender, subject contains "CRITICAL" or "ERROR", or any email from a monitoring service.
  3. Implement the alert handler — check conditions, format the alert message, POST to the alert channel.
  4. Test with real emails that should and should not trigger the alert to verify your conditions work correctly.
  5. Add alert throttling to avoid notification storms — if 100 monitoring alerts arrive in one minute, you probably only want one Slack message.
  6. Monitor the alert pipeline itself — use the JsonHook delivery log to ensure alert emails are being delivered to your handler.

Code Example

Alert handler that posts to Slack for critical monitoring emails:

import express from "express";
import crypto from "crypto";
import fetch from "node-fetch";

const app = express();
app.use(express.raw({ type: "application/json" }));

// Simple in-memory throttle: allow max 1 alert per 5 minutes per sender
const alertThrottle = new Map<string, number>();

// Conditions are checked in order; the first match wins.
// Default missing fields to "" so a sparse payload can't throw.
const ALERT_CONDITIONS = [
  { test: (e: any) => /critical|error|down|failed/i.test(e.subject ?? ""), severity: "high" },
  { test: (e: any) => (e.from ?? "").includes("monitoring@"), severity: "high" },
  { test: (e: any) => /warning|degraded/i.test(e.subject ?? ""), severity: "medium" },
];

async function sendSlackAlert(email: any, severity: string) {
  const icon = severity === "high" ? ":red_circle:" : ":warning:";
  await fetch(process.env.SLACK_WEBHOOK_URL!, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `${icon} *Email Alert* [${severity.toUpperCase()}]`,
      attachments: [{
        color: severity === "high" ? "danger" : "warning",
        fields: [
          { title: "From",    value: email.from,    short: true },
          { title: "Subject", value: email.subject, short: true },
          { title: "Preview", value: (email.textBody ?? "").slice(0, 200) },
        ],
      }],
    }),
  });
}

app.post("/webhooks/alerts", async (req, res) => {
  // ... verify HMAC ...
  const { email, deliveryId } = JSON.parse(req.body.toString());

  const throttleKey = email.from;
  const lastAlert = alertThrottle.get(throttleKey) ?? 0;
  const now = Date.now();

  if (now - lastAlert < 5 * 60_000) {
    console.log(`Throttled alert from ${email.from}`);
    return res.sendStatus(200);
  }

  for (const condition of ALERT_CONDITIONS) {
    if (condition.test(email)) {
      alertThrottle.set(throttleKey, now);
      await sendSlackAlert(email, condition.severity);
      console.log(`Alert sent for ${deliveryId}: ${email.subject}`);
      break;
    }
  }

  res.sendStatus(200);
});

app.listen(3000);

Common Pitfalls

Email alert pitfalls:

  • No alert throttling. If a failing system sends 100 error emails per minute, your team receives 100 Slack messages. Implement throttling (one alert per N minutes per sender/condition) to prevent notification storms.
  • Alert handler itself going down. If your alert handler is unavailable, critical alerts do not fire. Deploy the alert handler with higher availability than regular services, or use a managed function (e.g. Lambda) that is not taken down by your regular deployments.
  • Too broad or too narrow conditions. Overly broad conditions flood channels with noise; overly narrow conditions miss legitimate alerts. Start broad, monitor, and tighten conditions based on observed false positives and false negatives.
  • Not deduplicating across retries. If your Slack post fails and the webhook is retried, you may send duplicate alerts. Add idempotency using the deliveryId before firing any external alert.
  • Alerts only in one channel. For critical alerts, send to multiple channels (Slack + PagerDuty) to ensure at least one reaches the on-call engineer. Single-channel alerting has a single point of failure.
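
The deduplication pitfall can be handled with a small idempotency check keyed on deliveryId. A sketch (in-memory here; a shared store with TTL, such as Redis SET NX EX, is the usual choice when you run multiple handler instances):

```typescript
// Remember recently seen deliveryIds so a retried webhook
// doesn't fire a duplicate alert.
const seenDeliveries = new Map<string, number>();
const DEDUP_TTL_MS = 60 * 60_000; // remember ids for one hour

function isFirstDelivery(deliveryId: string, now = Date.now()): boolean {
  // Evict stale ids so the map doesn't grow without bound
  for (const [id, ts] of seenDeliveries) {
    if (now - ts > DEDUP_TTL_MS) seenDeliveries.delete(id);
  }
  if (seenDeliveries.has(deliveryId)) return false;
  seenDeliveries.set(deliveryId, now);
  return true;
}
```

Call isFirstDelivery(deliveryId) before any external alert call; on a retry it returns false and the handler can acknowledge the webhook without re-alerting.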
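
For the single-channel pitfall, a fan-out helper can send the same alert to every channel and treat it as delivered if at least one succeeds. A sketch (the senders are placeholders for your real Slack and PagerDuty calls):

```typescript
type ChannelSender = (message: string) => Promise<void>;

// Fire all channels in parallel; succeed if any one of them succeeds.
async function fanOutAlert(message: string, channels: ChannelSender[]): Promise<boolean> {
  const results = await Promise.allSettled(channels.map((send) => send(message)));
  const failed = results.filter((r) => r.status === "rejected").length;
  if (failed > 0) console.warn(`${failed}/${channels.length} alert channels failed`);
  return results.some((r) => r.status === "fulfilled");
}
```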

Frequently Asked Questions

Can I trigger PagerDuty incidents from inbound emails?

Yes. When the alert condition matches, POST to the PagerDuty Events API v2 with your integration key. Include the email subject as the alert summary and the textBody as the details. PagerDuty will create an incident, route to the on-call engineer, and handle escalation automatically.
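
A sketch of that PagerDuty call, split so the event body can be built and inspected separately (PAGERDUTY_ROUTING_KEY is assumed to hold your integration key; uses the global fetch available in Node 18+):

```typescript
interface AlertEmail {
  from: string;
  subject: string;
  textBody?: string;
}

// Build an Events API v2 "trigger" event from a matched email
function buildPagerDutyEvent(email: AlertEmail, routingKey: string) {
  return {
    routing_key: routingKey,
    event_action: "trigger",
    payload: {
      summary: email.subject,  // shown as the incident title
      source: email.from,
      severity: "critical",
      custom_details: { preview: (email.textBody ?? "").slice(0, 500) },
    },
  };
}

async function triggerPagerDuty(email: AlertEmail): Promise<void> {
  const res = await fetch("https://events.pagerduty.com/v2/enqueue", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildPagerDutyEvent(email, process.env.PAGERDUTY_ROUTING_KEY!)),
  });
  if (!res.ok) throw new Error(`PagerDuty Events API returned ${res.status}`);
}
```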

How do I avoid alert fatigue from repetitive monitoring emails?

Implement per-sender or per-condition throttling with a cooldown period (e.g., 5 minutes). Group similar alerts into a single message when multiple arrive within a short window. Consider severity tiering — low-severity alerts go to a dedicated low-noise channel while high-severity alerts page on-call immediately.
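
The grouping idea can be sketched as a small buffer that opens a window on the first alert for a key and flushes everything collected in that window as one message (setTimeout-based and in-memory; a real handler would also flush pending buffers on shutdown):

```typescript
const pending = new Map<string, string[]>();

function bufferAlert(
  key: string,                       // e.g. sender or condition name
  line: string,                      // one alert's summary line
  flush: (lines: string[]) => void,  // posts the grouped message
  windowMs = 30_000,
): void {
  const existing = pending.get(key);
  if (existing) {
    existing.push(line); // window already open: just accumulate
    return;
  }
  pending.set(key, [line]);
  setTimeout(() => {
    const lines = pending.get(key) ?? [];
    pending.delete(key);
    flush(lines); // one message for the whole window
  }, windowMs);
}
```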

Can I set up email alerts without writing code?

Yes. Route your JsonHook webhook to Zapier or Make and use their Slack, PagerDuty, or SMS action steps with conditional logic. No code required for simple alert conditions. For complex logic (throttling, severity tiers), a custom webhook handler gives you more control.

What if the Slack API is down when an alert fires?

Wrap your Slack API call in a retry loop with exponential backoff. Store alert events in a queue (BullMQ, SQS) so that if the initial Slack API call fails, the job is retried automatically. For critical alerts, send to a backup channel (an SMS to the on-call engineer's phone via an SMS API such as Twilio) as a fallback.
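
The inline retry variant can be sketched as a generic wrapper; a queue is sturdier for long outages, but this covers brief ones:

```typescript
// Retry an async action with exponential backoff and jitter.
async function withRetry<T>(fn: () => Promise<T>, attempts = 4, baseMs = 500): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (i < attempts - 1) {
        // ~500ms, 1s, 2s, ... plus jitter to avoid thundering herds
        const delay = baseMs * 2 ** i + Math.random() * 100;
        await new Promise((r) => setTimeout(r, delay));
      }
    }
  }
  throw lastErr;
}
```

Usage: wrap the delivery call, e.g. await withRetry(() => sendSlackAlert(email, "high")).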