Merge branch 'master' into postgres-query-builder

This commit is contained in:
Sven Klemm
2018-09-03 22:05:53 +02:00
38 changed files with 832 additions and 207 deletions

View File

@ -16,12 +16,11 @@ weight = 2
When an alert changes state, it sends out notifications. Each alert rule can have
multiple notifications. In order to add a notification to an alert rule you first need
to add and configure a `notification` channel (can be email, PagerDuty or other integration).
This is done from the Notification Channels page.
## Notification Channel Setup
{{< imgbox max-width="30%" img="/img/docs/v50/alerts_notifications_menu.png" caption="Alerting Notification Channels" >}}
On the Notification Channels page hit the `New Channel` button to go to the page where you can configure and set up a new Notification Channel.
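If you would rather script this step, channels can also be created through the alert notification HTTP API covered later in this change; a minimal sketch, assuming the `POST /api/alert-notifications` endpoint, where the API key, channel name and addresses are placeholders:
```http
POST /api/alert-notifications HTTP/1.1
Accept: application/json
Content-Type: application/json
Authorization: Bearer <api key>

{
  "name": "team email",
  "type": "email",
  "isDefault": false,
  "settings": {
    "addresses": "ops@example.com"
  }
}
```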
@ -30,7 +29,31 @@ sure it's setup correctly.
### Send on all alerts
When checked, this option will notify for all alert rules - existing and new.
### Send reminders
> Only available in Grafana v5.3 and above.
{{< docs-imagebox max-width="600px" img="/img/docs/v53/alerting_notification_reminders.png" class="docs-image--right" caption="Alerting notification reminders setup" >}}
When this option is checked, additional notifications (reminders) will be sent for triggered alerts. You can specify how often reminders
should be sent as a number of seconds (s), minutes (m) or hours (h), for example `30s`, `3m`, `5m` or `1h`.
**Important:** Alert reminders are sent after rules are evaluated. Therefore a reminder can never be sent more frequently than a configured [alert rule evaluation interval](/alerting/rules/#name-evaluation-interval).
These examples show how often and when reminders are sent for a triggered alert.
Alert rule evaluation interval | Send reminders every | Reminder sent every (after last alert notification)
---------- | ----------- | -----------
`30s` | `15s` | ~30 seconds
`1m` | `5m` | ~5 minutes
`5m` | `15m` | ~15 minutes
`6m` | `20m` | ~24 minutes
`1h` | `15m` | ~1 hour
`1h` | `2h` | ~2 hours
<div class="clearfix"></div>
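In the notification channel HTTP API shown further down in this change, these reminder settings map to the `sendReminder` and `frequency` fields; a sketch of the relevant fragment of a channel definition (the name and addresses are placeholders):
```json
{
  "name": "team email",
  "type": "email",
  "sendReminder": true,
  "frequency": "5m",
  "settings": {
    "addresses": "ops@example.com"
  }
}
```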
## Supported Notification Types
@ -132,23 +155,23 @@ Once these two properties are set, you can send the alerts to Kafka for further
### All supported notifiers
Name | Type | Support images | Support reminders
-----|------|----------------|------------------
Slack | `slack` | yes | yes
Pagerduty | `pagerduty` | yes | yes
Email | `email` | yes | yes
Webhook | `webhook` | link | yes
Kafka | `kafka` | no | yes
Hipchat | `hipchat` | yes | yes
VictorOps | `victorops` | yes | yes
Sensu | `sensu` | yes | yes
OpsGenie | `opsgenie` | yes | yes
Threema | `threema` | yes | yes
Pushover | `pushover` | no | yes
Telegram | `telegram` | no | yes
Line | `line` | no | yes
Microsoft Teams | `teams` | yes | yes
Prometheus Alertmanager | `prometheus-alertmanager` | no | no

View File

@ -88,6 +88,11 @@ So as you can see from the above scenario Grafana will not send out notification
to fire if the rule already is in state `Alerting`. To improve support for queries that return multiple series
we plan to track state **per series** in a future release.
> Starting with Grafana v5.3 you can configure reminders to be sent for triggered alerts. This will send additional notifications
> when an alert continues to fire. If other series (like server2 in the example above) also cause the alert rule to fire they will
> be included in the reminder notification. Depending on what notification channel you're using, you may be able to take advantage
> of this feature to identify new/existing series causing the alert to fire. [Read more about notification reminders here](/alerting/notifications/#send-reminders).
### No Data / Null values
Below your conditions you can configure how the rule evaluation engine should handle queries that return no data or only null values.

View File

@ -50,6 +50,7 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
```http
HTTP/1.1 200
Content-Type: application/json
```
## Get one alert
@ -86,6 +87,7 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
**Example Request**:
```http
POST /api/alerts/1/pause HTTP/1.1
Accept: application/json
Content-Type: application/json
@ -146,6 +148,7 @@ JSON Body Schema:
**Example Response**:
```http
HTTP/1.1 200
Content-Type: application/json
@ -177,6 +180,7 @@ JSON Body Schema:
```
## Update alert notification
`PUT /api/alert-notifications/1`
**Example Request**:
@ -204,14 +208,21 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
**Example Request**:
```http
DELETE /api/alert-notifications/1 HTTP/1.1
Accept: application/json
Content-Type: application/json
Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
```
**Example Response**:
```http
HTTP/1.1 200
Content-Type: application/json
```
## Create alert notification
@ -232,6 +243,7 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
"name": "new alert notification", //Required
"type": "email", //Required
"isDefault": false,
"sendReminder": false,
"settings": {
"addresses": "carl@grafana.com;dev@grafana.com"
}
@ -243,14 +255,18 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
```http
HTTP/1.1 200
Content-Type: application/json
{
"id": 1,
"name": "new alert notification",
"type": "email",
"isDefault": false,
"settings": { addresses: "carl@grafana.com;dev@grafana.com"} }
"created": "2017-01-01 12:34",
"updated": "2017-01-01 12:34"
"sendReminder": false,
"settings": {
"addresses": "carl@grafana.com;dev@grafana.com"
},
"created": "2018-04-23T14:44:09+02:00",
"updated": "2018-08-20T15:47:49+02:00"
}
```
@ -271,6 +287,8 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
"name": "new alert notification", //Required
"type": "email", //Required
"isDefault": false,
"sendReminder": true,
"frequency": "15m",
"settings": {
"addresses: "carl@grafana.com;dev@grafana.com"
}
@ -282,12 +300,17 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
```http
HTTP/1.1 200
Content-Type: application/json
{
"id": 1,
"name": "new alert notification",
"type": "email",
"isDefault": false,
"settings": { addresses: "carl@grafana.com;dev@grafana.com"} }
"sendReminder": true,
"frequency": "15m",
"settings": {
"addresses": "carl@grafana.com;dev@grafana.com"
},
"created": "2017-01-01 12:34",
"updated": "2017-01-01 12:34"
}
@ -311,6 +334,7 @@ Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
```http
HTTP/1.1 200
Content-Type: application/json
{
"message": "Notification deleted"
}

View File

@ -109,3 +109,11 @@ positioning system when you load them in v5. Dashboards saved in v5 will not wor
external panel plugins might need to be updated to work properly.
For more details on the new panel positioning system, [click here]({{< relref "reference/dashboard.md#panel-size-position" >}})
## Upgrading to v5.2
One of the database migrations included in this release will update all annotation timestamps from second to millisecond precision. If you have a large number of annotations, the database migration may take a long time to complete, which may cause problems if you use systemd to run Grafana.
We've received one report where, with systemd, PostgreSQL and a large number of annotations (table size 1645 MB), the database migration took 8-20 minutes to complete; however, systemd killed the grafana-server process after 90 seconds. Any database migration queries still in progress when systemd kills the grafana-server process continue to execute in the database until they finish.
If you're using systemd and have a large number of annotations, consider temporarily adjusting the systemd `TimeoutStartSec` setting to something high like `30m` before upgrading, as sketched below.
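Assuming Grafana runs under the standard `grafana-server.service` unit, one way to do this temporarily is a systemd drop-in; a sketch (remove it again once the migration has finished):
```ini
# /etc/systemd/system/grafana-server.service.d/override.conf
# Temporary override so the annotation migration can finish before systemd gives up.
[Service]
TimeoutStartSec=30m
```
Run `systemctl daemon-reload` after adding or removing the override so systemd picks up the change.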