Compare commits

..

18 Commits

Author SHA1 Message Date
George Robinson
fe17b64445 Backport Fix XSS in runbook URL (#681) to v9.2.x (#683)
(cherry picked from commit db1548c1491c2f5b522e3c0ceb1832b914a4b2f0)
(cherry picked from commit 3135a81edf0ebeb575c95560dd548f9589c14d02)
2022-11-29 17:51:34 +01:00
Grot (@grafanabot)
c98714f77f [v9.2.x] Docs: Add docs for labels with dots (#59486)
Docs: Add docs for labels with dots (#59352)

(cherry picked from commit c8c1499cd0)

Co-authored-by: George Robinson <george.robinson@grafana.com>
2022-11-29 13:38:16 +00:00
Grot (@grafanabot)
ac9819bfa5 [v9.2.x] Fix: Allow creating snapshot with no dashboard id (#59465)
Fix: Allow creating snapshot with no dashboard id (#58669)

(cherry picked from commit d279b6d7b0)

Co-authored-by: Gabriel MABILLE <gamab@users.noreply.github.com>
2022-11-29 10:09:08 +01:00
Grot (@grafanabot)
6b4394d8e6 [v9.2.x] SQL Datasources: Fix annotation migration (#59454)
SQL Datasources: Fix annotation migration (#59438)

(cherry picked from commit 71e4a8261d)

Co-authored-by: Zoltán Bedi <zoltan.bedi@gmail.com>
2022-11-29 01:25:25 -06:00
Marcus Efraimsson
9b7ad3d663 [v9.2.x] SSE: Make sure to forward headers, user and cookies/OAuth token (#58897) (#59430)
SSE: Make sure to forward headers, user and cookies/OAuth token (#58897)

Fixes #58793 and Fixes https://github.com/grafana/azure-data-explorer-datasource/issues/513

Co-authored-by: Marcus Efraimsson <marcus.efraimsson@gmail.com>
(cherry picked from commit 5623b5afaf)

Co-authored-by: Kyle Brandt <kyle@grafana.com>
2022-11-28 19:06:59 +01:00
George Robinson
8bb5b17692 Docs: Improve docs for images in notifications (#59033) (#59404)
(cherry picked from commit 0af3515e95)
2022-11-28 19:06:41 +01:00
Grot (@grafanabot)
12fc64b389 [v9.2.x] BarChart: fix hover overlay for hz stacked (#59397)
BarChart: fix hover overlay for hz stacked (#59359)

(cherry picked from commit 13d5ad2ce2)

Co-authored-by: Leon Sorokin <leeoniya@gmail.com>
2022-11-28 09:00:33 -05:00
Grot (@grafanabot)
052fd5713d [v9.2.x] Navigation: Fix crash when Help is disabled (#59375)
Navigation: Set navtree to an empty array instead of null (#58919)

set navtree to an empty array instead of null

(cherry picked from commit 4aa5dea96b)

Co-authored-by: Leo <108552997+lpskdl@users.noreply.github.com>
2022-11-28 06:54:07 -05:00
Ieva
6b64e4d192 Access Control: Clear user's permission cache after resource creation (#59318)
resolve merge conflicts
2022-11-24 18:10:56 +01:00
Grot (@grafanabot)
c89876323c [v9.2.x] TimeseriesPanel: Preserve string fields for data link interpolation (#59296)
TimeseriesPanel: Preserve string fields for data link interpolation (#58424)

* TimeseriesPanel: Preserve string fields for data link interpolation

* clean code

* Modify tests so that string fields are allowed only when a valid time/number dataframe exists

* performance mods

* fix wrong length

* remove console.log

* Check if aligned dataframe has links

(cherry picked from commit 0da77201bf)

Co-authored-by: Victor Marin <36818606+mdvictor@users.noreply.github.com>
2022-11-24 08:15:06 -05:00
Grot (@grafanabot)
24659cc117 [v9.2.x] PostgreSQL: Fix missing CA field from configuration (#59285)
PostgreSQL: Fix missing CA field from configuration (#59280)

* PostgreSQL: Fix missing CA field from configuration

(cherry picked from commit be73418d00)

Co-authored-by: Oscar Kilhed <oscar.kilhed@grafana.com>
2022-11-24 13:33:20 +01:00
Grot (@grafanabot)
9fef15403c [v9.2.x] Azure Monitor: Fix empty/errored responses for Logs variables (#59277)
Azure Monitor: Fix empty/errored responses for Logs variables (#59240)

(cherry picked from commit 276b54fe9d)

Co-authored-by: Andres Martinez Gotor <andres.martinez@grafana.com>
2022-11-24 05:42:12 -05:00
Grot (@grafanabot)
6fbdc2ed89 [v9.2.x] Heatmap: Fix blurry text & rendering (#59261)
Heatmap: Fix blurry text & rendering (#59260)

(cherry picked from commit 6f00bc5674)

Co-authored-by: Leon Sorokin <leeoniya@gmail.com>
2022-11-23 23:12:12 -05:00
Grot (@grafanabot)
09308c77d8 Release: Bump version to 9.2.7 (#59245)
"Release: Updated versions in package to 9.2.7"
2022-11-23 11:07:57 -06:00
owensmallwood
29f26b5a02 Changelog: Updated changelog for 9.2.6 (#59232) (#59244)
(cherry picked from commit ae508c12f3)

# Conflicts:
#	CHANGELOG.md

Co-authored-by: Grot (@grafanabot) <43478413+grafanabot@users.noreply.github.com>
2022-11-23 10:53:48 -06:00
Sven Grossmann
e6a7b53703 [9.2.x] Fix #58598 X-ID-Token header missing on Loki Datasource (#58784) (#59196)
* Fix #58598 X-ID-Token header missing on Loki Datasource (#58784)

* Fix #58598 X-ID-Token header missing on Loki Datasource

* Remove unnecessary continue statements

* Add getAuthHeadersForCallResource unit tests

* Fix test and switch statement issues introduced during merge

(cherry picked from commit f1ef63791a)

* update test

* missed linting

Co-authored-by: Yann Vigara <yvigara@users.noreply.github.com>
2022-11-23 12:52:35 +01:00
Zoltán Bedi
37236d6ab6 [v9.2.x] SQL: Fix code editor for SQL datasources (#59189)
SQL: Fix code editor for SQL datasources (#58116)

* SQL: Fix code editor for sql datasources

* Fix: mysql completion with defaultdb

(cherry picked from commit 75097b99fb)
2022-11-23 05:28:32 -05:00
Grot (@grafanabot)
a47cfa9b28 [v9.2.x] Azure Monitor: Fix resource picker selection for subresources (#59136)
Azure Monitor: Fix resource picker selection for subresources (#56392)

(cherry picked from commit 5b1ff83ee9)

Co-authored-by: Andres Martinez Gotor <andres.martinez@grafana.com>
2022-11-22 10:55:30 -05:00
69 changed files with 1098 additions and 878 deletions


@@ -1,3 +1,20 @@
<!-- 9.2.6 START -->
# 9.2.6 (2022-11-22)
### Features and enhancements
- **Alerting:** Support Prometheus durations in Provisioning API. [#58293](https://github.com/grafana/grafana/pull/58293), [@bartpeeters](https://github.com/bartpeeters)
- **SSE:** Keep value name from numeric table. [#58831](https://github.com/grafana/grafana/pull/58831), [@kylebrandt](https://github.com/kylebrandt)
- **Transformations:** Make Card Descriptions Clickable. [#58717](https://github.com/grafana/grafana/pull/58717), [@zuchka](https://github.com/zuchka)
### Bug fixes
- **MS/My/PostgresSQL:** Migrate annotation query. [#58847](https://github.com/grafana/grafana/pull/58847), [@zoltanbedi](https://github.com/zoltanbedi)
- **Search:** Fixes issue with Recent/Starred section always displaying "General" folder. [#58746](https://github.com/grafana/grafana/pull/58746), [@JoaoSilvaGrafana](https://github.com/JoaoSilvaGrafana)
- **Server:** Write internal server error on missing write. [#57813](https://github.com/grafana/grafana/pull/57813), [@sakjur](https://github.com/sakjur)
<!-- 9.2.6 END -->
<!-- 9.2.5 START -->
# 9.2.5 (2022-11-16)


@@ -99,3 +99,17 @@ The following template variables are available when expanding annotations and la
| $labels | The labels from the query or condition. For example, `{{ $labels.instance }}` and `{{ $labels.job }}`. This is unavailable when the rule uses a [classic condition]({{< relref "../../alerting-rules/create-grafana-managed-rule/#single-and-multi-dimensional-rule" >}}). |
| $values | The values of all reduce and math expressions that were evaluated for this alert rule. For example, `{{ $values.A }}`, `{{ $values.A.Labels }}` and `{{ $values.A.Value }}` where `A` is the `refID` of the reduce or math expression. If the rule uses a classic condition instead of a reduce and math expression, then `$values` contains the combination of the `refID` and position of the condition. |
| $value | The value string of the alert instance. For example, `[ var='A' labels={instance=foo} value=10 ]`. |
### Labels with dots
If a label contains a dot (full stop or period) in its name then the following will not work:
```
Instance {{ $labels.instance.name }} has been down for more than 5 minutes
```
This is because we are printing a non-existing field `name` in `$labels.instance` rather than `instance.name` in `$labels`. Instead we can use the `index` function to print `instance.name`:
```
Instance {{ index $labels "instance.name" }} has been down for more than 5 minutes
```


@@ -6,39 +6,64 @@ keywords:
- alerting
- images
- notifications
title: Images in notifications
title: Use images in notifications
weight: 460
---
# Images in notifications
# Use images in notifications
Images in notifications helps recipients of alert notifications better understand why an alert has fired or resolved by including an image of the panel associated with the Grafana managed alert rule.
Images in notifications helps recipients of alert notifications better understand why an alert has fired or resolved by including a screenshot of the panel associated with the alert.
> **Note**: Images in notifications are not available for Grafana Mimir and Loki managed alert rules, or when Grafana is set up to send alert notifications to an external Alertmanager.
> **Note**: This feature is not supported for Mimir or Loki rules, or when Grafana sends alert notifications to an external Alertmanager.
If Grafana is set up to send images in notifications, it takes a screenshot of the panel for the Grafana managed alert rule when either of the following happen:
When an alert is fired or resolved, Grafana takes a screenshot of the panel associated with the alert. The panel is identified via the Dashboard UID and Panel ID annotations of the rule. Grafana cannot take a screenshot for alerts that are not associated with a panel.
1. The alert rule transitions from pending to firing
2. The alert rule transitions from firing to OK
Because a number of contact points, such as email, do not support uploading screenshots at the time of sending a notification, Grafana can also upload the screenshot to a cloud storage service such as Amazon S3, Azure Blob Storage, or Google Cloud Storage, where a link to the uploaded screenshot can be added to the notification. However, if using a cloud storage service is not an option, Grafana can act as its own cloud storage service, making the screenshot available under the same domain as Grafana.
Grafana does not support images for alert rules that are not associated with a panel. An alert rule is associated with a panel when it has both Dashboard UID and Panel ID annotations.
Should either the cloud storage service or Grafana (if acting as its own cloud storage service) be protected by a firewall, gateway service, or VPN, screenshots might not be shown in notifications.
Images are stored in the [data]({{< relref "../setup-grafana/configure-grafana/#paths" >}}) path and so Grafana must have write-access to this path. If Grafana cannot write to this path then screenshots cannot be saved to disk and an error will be logged for each failed screenshot attempt. In addition to storing images on disk, Grafana can also store the image in an external image store such as Amazon S3, Azure Blob Storage, Google Cloud Storage and even Grafana where screenshots are stored in `public/img/attachments`. Screenshots older than `temp_data_lifetime` are deleted from disk but not the external image store. If Grafana is the external image store then screenshots are deleted from `data` but not from `public/img/attachments`.
The choice between uploading screenshots at the time of sending the notification, using a cloud storage service, or using Grafana as its own cloud storage service depends on which contact points you plan to use and whether you use a firewall, gateway service, or VPN.
> **Note**: It is recommended that you use an external image store, as not all contact points support uploading images from disk. It is also possible that the image on disk is deleted before an alert notification is sent if `temp_data_lifetime` is less than the `group_wait` and `group_interval` options used in Alertmanager.
For example, if a contact point supports uploading images at the time of notification, it is not required to use cloud storage. Cloud storage is required when a contact point does not support uploading images at the time of sending a notification, such as email. We don't recommend using cloud storage if the cloud storage service is behind a firewall, gateway service, or VPN, as screenshots might not be shown in notifications.
Please refer to the table at the end of this page for a list of contact points and their support for images in notifications.
## Requirements
To use images in notifications, Grafana must be set up to use [image rendering]({{< relref "../setup-grafana/image-rendering/" >}}). It is also recommended that Grafana is set up to upload images to an [external image store]({{< relref "../setup-grafana/configure-grafana/#external_image_storage" >}}) such as Amazon S3, Azure Blob Storage, Google Cloud Storage or even Grafana.
To use images in notifications, Grafana must be set up to use [image rendering](https://grafana.com/docs/grafana/next/setup-grafana/image-rendering/). You can either install the image rendering plugin or run it as a remote rendering service.
When a screenshot is taken it is saved to the [data]({{< relref "../../setup-grafana/configure-grafana/#paths" >}}) path. This is where screenshots are stored before being sent in a notification or uploaded to a cloud storage service. Grafana must have write-access to this path. If Grafana cannot write to this path then screenshots cannot be saved to disk and an error will be logged for each failed screenshot attempt.
If using a [cloud storage service](https://grafana.com/docs/grafana/latest/setup-grafana/configure-grafana/#external_image_storage) such as Amazon S3, Azure Blob Storage or Google Cloud Storage, uploaded images need to be accessible outside of a firewall, gateway service or VPN for screenshots to be shown in notifications. Grafana will not delete screenshots from cloud storage. We recommend configuring a retention policy on the bucket to delete screenshots older than 1 month.
If using Grafana as its own cloud storage service then screenshots will be saved to `static_root_path/img/attachments`. `static_root_path` is a configuration option for Grafana and can be found in `defaults.ini`. However, like when using a cloud storage service, images need to be accessible outside of a firewall, gateway service or VPN for screenshots to be shown in notifications.
When using Grafana as its own cloud storage service, screenshots are copied from [data]({{< relref "../../setup-grafana/configure-grafana/#paths" >}}) to `static_root_path/img/attachments`. Screenshots older than `temp_data_lifetime` are deleted from [data]({{< relref "../../setup-grafana/configure-grafana/#paths" >}}) but not from `static_root_path/img/attachments`. To delete screenshots from `static_root_path` after a certain amount of time, we recommend setting up a CRON job.
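A CRON job for this could look like the following sketch, assuming `static_root_path` resolves to `/usr/share/grafana/public` and a 30-day retention; adjust the path and age to your setup:

```
# Run daily at 02:00: delete screenshot attachments older than 30 days
0 2 * * * find /usr/share/grafana/public/img/attachments -type f -mtime +30 -delete
```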
## Configuration
If Grafana has been set up to use [image rendering]({{< relref "../setup-grafana/image-rendering/" >}}) images in notifications can be turned on via the `capture` option in `[unified_alerting.screenshots]`:
Having installed either the image rendering plugin, or set up Grafana to use a remote rendering service, set `capture` in `[unified_alerting.screenshots]` to `true`:
# Enable screenshots in notifications. This option requires the Grafana Image Renderer plugin.
# For more information on configuration options, refer to [rendering].
capture = true
capture = false
It is recommended that `max_concurrent_screenshots` is set to a value that is less than or equal to `concurrent_render_request_limit`. The default value for both `max_concurrent_screenshots` and `concurrent_render_request_limit` is `5`:
If screenshots should be uploaded to cloud storage then `upload_external_image_storage` should also be set to `true`:
# Uploads screenshots to the local Grafana server or remote storage such as Azure, S3 and GCS. Please
# see [external_image_storage] for further configuration options. If this option is false, screenshots
# will be persisted to disk for up to temp_data_lifetime.
upload_external_image_storage = false
Please see [`[external_image_storage]`](https://grafana.com/docs/grafana/latest/setup-grafana/configure-grafana/#external_image_storage) for instructions on how to configure cloud storage. Grafana will not start if `upload_external_image_storage` is `true` and `[external_image_storage]` contains missing or invalid configuration.
If Grafana is acting as its own cloud storage, then `upload_external_image_storage` should be set to `true` and the `local` provider should be set in [`[external_image_storage]`](https://grafana.com/docs/grafana/latest/setup-grafana/configure-grafana/#external_image_storage).
Restart Grafana for the changes to take effect.
## Advanced configuration
We recommend that `max_concurrent_screenshots` is less than or equal to `concurrent_render_request_limit`. The default value for both `max_concurrent_screenshots` and `concurrent_render_request_limit` is `5`:
# The maximum number of screenshots that can be taken at the same time. This option is different from
# concurrent_render_request_limit as max_concurrent_screenshots sets the number of concurrent screenshots
@@ -46,40 +71,50 @@ It is recommended that `max_concurrent_screenshots` is set to a value that is le
# the total number of concurrent screenshots across all Grafana services.
max_concurrent_screenshots = 5
If Grafana has been set up to use an external image store, `upload_external_image_storage` should be set to `true`:
## Support for images in contact points
# Uploads screenshots to the local Grafana server or remote storage such as Azure, S3 and GCS. Please
# see [external_image_storage] for further configuration options. If this option is false, screenshots
# will be persisted to disk for up to temp_data_lifetime.
upload_external_image_storage = false
Grafana supports a wide range of contact points with varying support for images in notifications. The table below lists all contact points supported in Grafana and whether each supports uploading images at the time of sending the notification or linking to images uploaded to cloud storage, including when Grafana is acting as its own cloud storage service.
Restart Grafana for the changes to take effect.
| Name | Upload image at time of notification | Cloud storage |
| ----------------------- | ------------------------------------ | ------------- |
| DingDing | No | No |
| Discord | Yes | Yes |
| Email | Yes | Yes |
| Google Hangouts Chat | No | Yes |
| Kafka | No | No |
| Line | No | No |
| Microsoft Teams | No | Yes |
| Opsgenie | No | Yes |
| Pagerduty | No | Yes |
| Prometheus Alertmanager | No | No |
| Pushover | Yes | No |
| Sensu Go | No | No |
| Slack | No (will be available in 9.4) | Yes |
| Telegram | Yes | No |
| Threema | No | No |
| VictorOps | No | No |
| Webhook | No | Yes |
| Cisco Webex Teams | No | Yes |
## Supported notifiers
## Limitations
Images in notifications are supported in the following notifiers and additional support will be added in the future:
- This feature is not supported for Mimir or Loki rules, or when Grafana sends alert notifications to an external Alertmanager.
- When multiple alerts are sent in a single notification, a screenshot might be included for each alert. The order in which the images are shown is random.
- Some contact points support at most one image per notification. In this case, the first image associated with an alert will be attached.
- We don't recommend using cloud storage if the cloud storage service is behind a firewall, gateway service, or VPN, as screenshots might not be shown in notifications.
| Name | Upload images from disk | Include images from URL |
| ----------------------- | ----------------------- | ----------------------- |
| DingDing | No | No |
| Discord | Yes | Yes |
| Email | Yes | Yes |
| Google Hangouts Chat | No | Yes |
| Kafka | No | No |
| Line | No | No |
| Microsoft Teams | No | Yes |
| Opsgenie | No | Yes |
| Pagerduty | No | Yes |
| Prometheus Alertmanager | No | No |
| Pushover | Yes | No |
| Sensu Go | No | No |
| Slack | No | Yes |
| Telegram | No | No |
| Threema | No | No |
| VictorOps | No | No |
| Webhook | No | Yes |
## Troubleshooting
Include images from URL refers to using the external image store.
If Grafana has been set up to send images in notifications but notifications are still being received without them, follow the troubleshooting steps below:
1. Check that images in notifications has been set up as per the instructions.
2. Enable debug logging in Grafana and look for logs with the logger `ngalert.image`.
3. If the alert is not associated with a dashboard there will be logs for `Cannot take screenshot for alert rule as it is not associated with a dashboard`.
4. If the alert is associated with a dashboard, but no panel in the dashboard, there will be logs for `Cannot take screenshot for alert rule as it is not associated with a panel`.
5. If images cannot be taken because of mis-configuration or an issue with image rendering there will be logs for `Failed to take an image` including the Dashboard UID, Panel ID, and the error message.
6. Check that the contact point supports images in notifications, and the present configuration, as per the table.
7. If the image was uploaded to cloud storage make sure it is public.
8. If images are made available via Grafana's built-in web server, make sure it is accessible via the Internet.
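For step 2, debug logging can be scoped to the screenshot code with a log filter rather than enabled globally. A sketch for `grafana.ini`, assuming the `filters` option's `logger:level` syntax:

```
[log]
# keep the global level at info, but log the screenshot/image code at debug
level = info
filters = ngalert.image:debug
```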
## Metrics
@@ -93,20 +128,3 @@ For example, if a screenshot could not be taken within the expected time (10 sec
- `grafana_screenshot_successes_total`
- `grafana_screenshot_upload_failures_total`
- `grafana_screenshot_upload_successes_total`
## Limitations
- Images in notifications are not available for Grafana Mimir and Loki managed alert rules, or when Grafana is set up to send alert notifications to an external Alertmanager.
- When alerts generated by different alert rules are sent in a single notification, there may be screenshots for each alert rule. This happens if an alert group contains multiple alerting rules. The order in which the images are attached is random. If you need to guarantee the ordering of images, make sure that your alert groups contain a single alerting rule.
- Some contact points only handle a single image. In this case, the first image associated with an alert will be attached. Because the ordering is random, this may not always be an image for the same alert rule. If you need to guarantee you receive a screenshot for a particular rule, make sure that your alert groups contain a single alerting rule.
## Troubleshooting
If Grafana has been set up to send images in notifications but notifications are still being received without images, check the troubleshooting instructions below.
1. Check that Grafana has been set up as per the instructions
2. Check that the notifier supports images in notifications as per the table of supported notifiers
3. Enable debug logging in Grafana and look for logs with the logger `ngalert.image`
4. If the alert is not associated with a dashboard there will be logs for `Cannot take screenshot for alert rule as it is not associated with a dashboard`
5. If the alert is associated with a dashboard, but no panel in the dashboard, there will be logs for `Cannot take screenshot for alert rule as it is not associated with a panel`
6. If images cannot be taken because of mis-configuration or an issue with image rendering there will be logs for `Failed to take an image` including the Dashboard UID, Panel ID, and the error message


@@ -4,5 +4,5 @@
"packages": [
"packages/*"
],
"version": "9.2.6"
"version": "9.2.7"
}


@@ -3,7 +3,7 @@
"license": "AGPL-3.0-only",
"private": true,
"name": "grafana",
"version": "9.2.6",
"version": "9.2.7",
"repository": "github:grafana/grafana",
"scripts": {
"api-tests": "jest --notify --watch --config=devenv/e2e-api-tests/jest.js",


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/data",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana Data Library",
"keywords": [
"typescript"
@@ -34,7 +34,7 @@
},
"dependencies": {
"@braintree/sanitize-url": "6.0.0",
"@grafana/schema": "9.2.6",
"@grafana/schema": "9.2.7",
"@types/d3-interpolate": "^1.4.0",
"d3-interpolate": "1.4.0",
"date-fns": "2.29.1",


@@ -23,4 +23,8 @@ export class SortedVector<T = any> implements Vector<T> {
toJSON(): T[] {
return vectorToArray(this);
}
getOrderArray(): number[] {
return this.order;
}
}


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/e2e-selectors",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana End-to-End Test Selectors Library",
"keywords": [
"cli",


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/e2e",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana End-to-End Test Library",
"keywords": [
"cli",
@@ -61,7 +61,7 @@
"@babel/core": "7.19.0",
"@babel/preset-env": "7.19.0",
"@cypress/webpack-preprocessor": "5.12.0",
"@grafana/e2e-selectors": "9.2.6",
"@grafana/e2e-selectors": "9.2.7",
"@grafana/tsconfig": "^1.2.0-rc1",
"@mochajs/json-file-reporter": "^1.2.0",
"babel-loader": "8.2.5",


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/runtime",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana Runtime Library",
"keywords": [
"grafana",
@@ -36,9 +36,9 @@
},
"dependencies": {
"@grafana/agent-web": "^0.4.0",
"@grafana/data": "9.2.6",
"@grafana/e2e-selectors": "9.2.6",
"@grafana/ui": "9.2.6",
"@grafana/data": "9.2.7",
"@grafana/e2e-selectors": "9.2.7",
"@grafana/ui": "9.2.7",
"@sentry/browser": "6.19.7",
"history": "4.10.1",
"lodash": "4.17.21",


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/schema",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana Schema Library",
"keywords": [
"typescript"


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/toolkit",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana Toolkit",
"keywords": [
"grafana",
@@ -50,10 +50,10 @@
"@babel/preset-env": "7.18.9",
"@babel/preset-react": "7.18.6",
"@babel/preset-typescript": "7.18.6",
"@grafana/data": "9.2.6",
"@grafana/data": "9.2.7",
"@grafana/eslint-config": "5.0.0",
"@grafana/tsconfig": "^1.2.0-rc1",
"@grafana/ui": "9.2.6",
"@grafana/ui": "9.2.7",
"@jest/core": "27.5.1",
"@types/command-exists": "^1.2.0",
"@types/eslint": "8.4.1",


@@ -2,7 +2,7 @@
"author": "Grafana Labs",
"license": "Apache-2.0",
"name": "@grafana/ui",
"version": "9.2.6",
"version": "9.2.7",
"description": "Grafana Components Library",
"keywords": [
"grafana",
@@ -47,9 +47,9 @@
"dependencies": {
"@emotion/css": "11.9.0",
"@emotion/react": "11.9.3",
"@grafana/data": "9.2.6",
"@grafana/e2e-selectors": "9.2.6",
"@grafana/schema": "9.2.6",
"@grafana/data": "9.2.7",
"@grafana/e2e-selectors": "9.2.7",
"@grafana/schema": "9.2.7",
"@monaco-editor/react": "4.4.5",
"@popperjs/core": "2.11.5",
"@react-aria/button": "3.6.1",


@@ -65,8 +65,8 @@ export class UPlotChart extends Component<PlotProps, UPlotChartState> {
});
const config: Options = {
width: this.props.width,
height: this.props.height,
width: Math.floor(this.props.width),
height: Math.floor(this.props.height),
...this.props.config.getConfig(),
};
@@ -93,8 +93,8 @@ export class UPlotChart extends Component<PlotProps, UPlotChartState> {
if (!sameDims(prevProps, this.props)) {
plot?.setSize({
width: this.props.width,
height: this.props.height,
width: Math.floor(this.props.width),
height: Math.floor(this.props.height),
});
} else if (!sameConfig(prevProps, this.props)) {
this.reinitPlot();


@@ -1,6 +1,6 @@
{
"name": "@jaegertracing/jaeger-ui-components",
"version": "9.2.6",
"version": "9.2.7",
"main": "src/index.ts",
"types": "src/index.ts",
"license": "Apache-2.0",
@@ -31,10 +31,10 @@
},
"dependencies": {
"@emotion/css": "11.9.0",
"@grafana/data": "9.2.6",
"@grafana/e2e-selectors": "9.2.6",
"@grafana/runtime": "9.2.6",
"@grafana/ui": "9.2.6",
"@grafana/data": "9.2.7",
"@grafana/e2e-selectors": "9.2.7",
"@grafana/runtime": "9.2.7",
"@grafana/ui": "9.2.7",
"chance": "^1.0.10",
"classnames": "^2.2.5",
"combokeys": "^3.0.0",


@@ -24,6 +24,7 @@ import (
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/acimpl"
"github.com/grafana/grafana/pkg/services/accesscontrol/actest"
accesscontrolmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
"github.com/grafana/grafana/pkg/services/accesscontrol/ossaccesscontrol"
"github.com/grafana/grafana/pkg/services/annotations/annotationstest"
@@ -250,15 +251,16 @@ func setupAccessControlScenarioContext(t *testing.T, cfg *setting.Cfg, url strin
store := sqlstore.InitTestDB(t)
hs := &HTTPServer{
Cfg: cfg,
Live: newTestLive(t, store),
License: &licensing.OSSLicensingService{},
Features: featuremgmt.WithFeatures(),
QuotaService: &quotaimpl.Service{Cfg: cfg},
RouteRegister: routing.NewRouteRegister(),
AccessControl: accesscontrolmock.New().WithPermissions(permissions),
searchUsersService: searchusers.ProvideUsersService(filters.ProvideOSSSearchUserFilter(), usertest.NewUserServiceFake()),
ldapGroups: ldap.ProvideGroupsService(),
Cfg: cfg,
Live: newTestLive(t, store),
License: &licensing.OSSLicensingService{},
Features: featuremgmt.WithFeatures(),
QuotaService: &quotaimpl.Service{Cfg: cfg},
RouteRegister: routing.NewRouteRegister(),
AccessControl: accesscontrolmock.New().WithPermissions(permissions),
searchUsersService: searchusers.ProvideUsersService(filters.ProvideOSSSearchUserFilter(), usertest.NewUserServiceFake()),
ldapGroups: ldap.ProvideGroupsService(),
accesscontrolService: actest.FakeService{},
}
sc := setupScenarioContext(t, url)


@@ -461,7 +461,7 @@ func (hs *HTTPServer) postDashboard(c *models.ReqContext, cmd models.SaveDashboa
}
if liveerr != nil {
hs.log.Warn("unable to broadcast save event", "uid", dashboard.Uid, "error", err)
hs.log.Warn("unable to broadcast save event", "uid", dashboard.Uid, "error", liveerr)
}
}
@@ -469,6 +469,12 @@ func (hs *HTTPServer) postDashboard(c *models.ReqContext, cmd models.SaveDashboa
return apierrors.ToDashboardErrorResponse(ctx, hs.pluginStore, err)
}
// Clear permission cache for the user who's created the dashboard, so that new permissions are fetched for their next call
// Required for cases when caller wants to immediately interact with the newly created object
if newDashboard && !hs.accesscontrolService.IsDisabled() {
hs.accesscontrolService.ClearUserPermissionCache(c.SignedInUser)
}
// connect library panels for this dashboard after the dashboard is stored and has an ID
err = hs.LibraryPanelService.ConnectLibraryPanelsForDashboard(ctx, c.SignedInUser, dashboard)
if err != nil {


@@ -130,11 +130,6 @@ func (hs *HTTPServer) CreateDashboardSnapshot(c *models.ReqContext) response.Res
metrics.MApiDashboardSnapshotExternal.Inc()
} else {
if cmd.Dashboard.Get("id").MustInt64() == 0 {
c.JSON(http.StatusBadRequest, "Creating a local snapshot requires a dashboard")
return nil
}
if cmd.Key == "" {
var err error
cmd.Key, err = util.GetRandomString(32)


@@ -20,6 +20,7 @@ import (
"github.com/grafana/grafana/pkg/framework/coremodel/registry"
"github.com/grafana/grafana/pkg/infra/usagestats"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/accesscontrol/actest"
accesscontrolmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
"github.com/grafana/grafana/pkg/services/alerting"
"github.com/grafana/grafana/pkg/services/annotations/annotationstest"
@@ -1034,6 +1035,7 @@ func postDashboardScenario(t *testing.T, desc string, url string, routePattern s
folderService: folderService,
Features: featuremgmt.WithFeatures(),
Coremodels: registry.NewBase(),
accesscontrolService: actest.FakeService{},
}
sc := setupScenarioContext(t, url)
@@ -1106,6 +1108,7 @@ func restoreDashboardVersionScenario(t *testing.T, desc string, url string, rout
Features: featuremgmt.WithFeatures(),
dashboardVersionService: fakeDashboardVersionService,
Coremodels: registry.NewBase(),
accesscontrolService: actest.FakeService{},
}
sc := setupScenarioContext(t, url)


@@ -396,6 +396,12 @@ func (hs *HTTPServer) AddDataSource(c *models.ReqContext) response.Response {
return response.Error(500, "Failed to add datasource", err)
}
// Clear permission cache for the user who's created the data source, so that new permissions are fetched for their next call
// Required for cases when caller wants to immediately interact with the newly created object
if !hs.AccessControl.IsDisabled() {
hs.accesscontrolService.ClearUserPermissionCache(c.SignedInUser)
}
ds := hs.convertModelToDtos(c.Req.Context(), cmd.Result)
return response.JSON(http.StatusOK, util.DynMap{
"message": "Datasource added",


@@ -18,6 +18,8 @@ import (
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/models"
ac "github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/acimpl"
"github.com/grafana/grafana/pkg/services/accesscontrol/actest"
"github.com/grafana/grafana/pkg/services/datasources"
"github.com/grafana/grafana/pkg/services/datasources/permissions"
"github.com/grafana/grafana/pkg/services/org"
@@ -111,7 +113,9 @@ func TestAddDataSource_URLWithoutProtocol(t *testing.T) {
DataSourcesService: &dataSourcesServiceMock{
expectedDatasource: &datasources.DataSource{},
},
Cfg: setting.NewCfg(),
Cfg: setting.NewCfg(),
AccessControl: acimpl.ProvideAccessControl(setting.NewCfg()),
accesscontrolService: actest.FakeService{},
}
sc := setupScenarioContext(t, "/api/datasources")
@@ -223,7 +227,9 @@ func TestUpdateDataSource_URLWithoutProtocol(t *testing.T) {
DataSourcesService: &dataSourcesServiceMock{
expectedDatasource: &datasources.DataSource{},
},
Cfg: setting.NewCfg(),
Cfg: setting.NewCfg(),
AccessControl: acimpl.ProvideAccessControl(setting.NewCfg()),
accesscontrolService: actest.FakeService{},
}
sc := setupScenarioContext(t, "/api/datasources/1234")


@@ -123,6 +123,12 @@ func (hs *HTTPServer) CreateFolder(c *models.ReqContext) response.Response {
return apierrors.ToFolderErrorResponse(err)
}
// Clear permission cache for the user who's created the folder, so that new permissions are fetched for their next call
// Required for cases when caller wants to immediately interact with the newly created object
if !hs.AccessControl.IsDisabled() {
hs.accesscontrolService.ClearUserPermissionCache(c.SignedInUser)
}
g := guardian.New(c.Req.Context(), folder.Id, c.OrgID, c.SignedInUser)
return response.JSON(http.StatusOK, hs.toFolderDto(c, g, folder))
}


@@ -16,6 +16,7 @@ import (
"github.com/grafana/grafana/pkg/api/routing"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/actest"
acmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
"github.com/grafana/grafana/pkg/services/dashboards"
"github.com/grafana/grafana/pkg/services/featuremgmt"
@@ -247,10 +248,11 @@ func createFolderScenario(t *testing.T, desc string, url string, routePattern st
store := mockstore.NewSQLStoreMock()
guardian.InitLegacyGuardian(store, dashSvc, teamSvc)
hs := HTTPServer{
AccessControl: acmock.New(),
folderService: folderService,
Cfg: setting.NewCfg(),
Features: featuremgmt.WithFeatures(),
AccessControl: acmock.New(),
folderService: folderService,
Cfg: setting.NewCfg(),
Features: featuremgmt.WithFeatures(),
accesscontrolService: actest.FakeService{},
}
sc := setupScenarioContext(t, url)


@@ -41,6 +41,12 @@ func (hs *HTTPServer) CreateTeam(c *models.ReqContext) response.Response {
return response.Error(500, "Failed to create Team", err)
}
// Clear permission cache for the user who's created the team, so that new permissions are fetched for their next call
// Required for cases when caller wants to immediately interact with the newly created object
if !hs.AccessControl.IsDisabled() {
hs.accesscontrolService.ClearUserPermissionCache(c.SignedInUser)
}
if accessControlEnabled || (c.OrgRole == org.RoleEditor && hs.Cfg.EditorsCanAdmin) {
// if the request is authenticated using API tokens
// the SignedInUser is an empty struct therefore


@@ -143,11 +143,12 @@ func (s *Service) buildGraph(req *Request) (*simple.DirectedGraph, error) {
}
rn := &rawNode{
Query: rawQueryProp,
RefID: query.RefID,
TimeRange: query.TimeRange,
QueryType: query.QueryType,
DataSource: query.DataSource,
Query: rawQueryProp,
RefID: query.RefID,
TimeRange: query.TimeRange,
QueryType: query.QueryType,
DataSource: query.DataSource,
QueryEnricher: query.QueryEnricher,
}
var node Node


@@ -42,11 +42,12 @@ type baseNode struct {
}
type rawNode struct {
RefID string `json:"refId"`
Query map[string]interface{}
QueryType string
TimeRange TimeRange
DataSource *datasources.DataSource
RefID string `json:"refId"`
Query map[string]interface{}
QueryType string
TimeRange TimeRange
DataSource *datasources.DataSource
QueryEnricher QueryDataRequestEnricher
}
func (rn *rawNode) GetCommandType() (c CommandType, err error) {
@@ -137,8 +138,9 @@ const (
// DSNode is a DPNode that holds a datasource request.
type DSNode struct {
baseNode
query json.RawMessage
datasource *datasources.DataSource
query json.RawMessage
datasource *datasources.DataSource
queryEnricher QueryDataRequestEnricher
orgID int64
queryType string
@@ -164,14 +166,15 @@ func (s *Service) buildDSNode(dp *simple.DirectedGraph, rn *rawNode, req *Reques
id: dp.NewNode().ID(),
refID: rn.RefID,
},
orgID: req.OrgId,
query: json.RawMessage(encodedQuery),
queryType: rn.QueryType,
intervalMS: defaultIntervalMS,
maxDP: defaultMaxDP,
timeRange: rn.TimeRange,
request: *req,
datasource: rn.DataSource,
orgID: req.OrgId,
query: json.RawMessage(encodedQuery),
queryType: rn.QueryType,
intervalMS: defaultIntervalMS,
maxDP: defaultMaxDP,
timeRange: rn.TimeRange,
request: *req,
datasource: rn.DataSource,
queryEnricher: rn.QueryEnricher,
}
var floatIntervalMS float64
@@ -205,27 +208,32 @@ func (dn *DSNode) Execute(ctx context.Context, vars mathexp.Vars, s *Service) (m
OrgID: dn.orgID,
DataSourceInstanceSettings: dsInstanceSettings,
PluginID: dn.datasource.Type,
User: dn.request.User,
}
q := []backend.DataQuery{
{
RefID: dn.refID,
MaxDataPoints: dn.maxDP,
Interval: time.Duration(int64(time.Millisecond) * dn.intervalMS),
JSON: dn.query,
TimeRange: backend.TimeRange{
From: dn.timeRange.From,
To: dn.timeRange.To,
},
QueryType: dn.queryType,
},
}
resp, err := s.dataService.QueryData(ctx, &backend.QueryDataRequest{
req := &backend.QueryDataRequest{
PluginContext: pc,
Queries: q,
Headers: dn.request.Headers,
})
Queries: []backend.DataQuery{
{
RefID: dn.refID,
MaxDataPoints: dn.maxDP,
Interval: time.Duration(int64(time.Millisecond) * dn.intervalMS),
JSON: dn.query,
TimeRange: backend.TimeRange{
From: dn.timeRange.From,
To: dn.timeRange.To,
},
QueryType: dn.queryType,
},
},
Headers: dn.request.Headers,
}
if dn.queryEnricher != nil {
ctx = dn.queryEnricher(ctx, req)
}
resp, err := s.dataService.QueryData(ctx, req)
if err != nil {
return mathexp.Results{}, err
}


@@ -35,14 +35,19 @@ type Request struct {
Debug bool
OrgId int64
Queries []Query
User *backend.User
}
// QueryDataRequestEnricher is a function for enriching a backend.QueryDataRequest.
type QueryDataRequestEnricher func(ctx context.Context, req *backend.QueryDataRequest) context.Context
// Query is like plugins.DataSubQuery, but with a time range, and only the UID
// for the data source. Also interval is a time.Duration.
type Query struct {
RefID string
TimeRange TimeRange
DataSource *datasources.DataSource `json:"datasource"`
QueryEnricher QueryDataRequestEnricher
JSON json.RawMessage
Interval time.Duration
QueryType string


@@ -26,6 +26,8 @@ type Service interface {
registry.ProvidesUsageStats
// GetUserPermissions returns user permissions with only action and scope fields set.
GetUserPermissions(ctx context.Context, user *user.SignedInUser, options Options) ([]Permission, error)
// ClearUserPermissionCache removes the permission cache entry for the given user
ClearUserPermissionCache(user *user.SignedInUser)
// DeleteUserPermissions removes all permissions the user has in the org, and all permissions granted on that user
// If orgID is set to 0 remove permissions from all orgs
DeleteUserPermissions(ctx context.Context, orgID, userID int64) error


@@ -138,6 +138,14 @@ func (s *Service) getCachedUserPermissions(ctx context.Context, user *user.Signe
return permissions, nil
}
func (s *Service) ClearUserPermissionCache(user *user.SignedInUser) {
key, err := permissionCacheKey(user)
if err != nil {
return
}
s.cache.Delete(key)
}
func (s *Service) DeleteUserPermissions(ctx context.Context, orgID int64, userID int64) error {
return s.store.DeleteUserPermissions(ctx, orgID, userID)
}


@@ -24,6 +24,8 @@ func (f FakeService) GetUserPermissions(ctx context.Context, user *user.SignedIn
return f.ExpectedPermissions, f.ExpectedErr
}
func (f FakeService) ClearUserPermissionCache(user *user.SignedInUser) {}
func (f FakeService) DeleteUserPermissions(ctx context.Context, orgID, userID int64) error {
return f.ExpectedErr
}


@@ -18,6 +18,7 @@ type fullAccessControl interface {
type Calls struct {
Evaluate []interface{}
GetUserPermissions []interface{}
ClearUserPermissionCache []interface{}
IsDisabled []interface{}
DeclareFixedRoles []interface{}
GetUserBuiltInRoles []interface{}
@@ -40,6 +41,7 @@ type Mock struct {
// Override functions
EvaluateFunc func(context.Context, *user.SignedInUser, accesscontrol.Evaluator) (bool, error)
GetUserPermissionsFunc func(context.Context, *user.SignedInUser, accesscontrol.Options) ([]accesscontrol.Permission, error)
ClearUserPermissionCacheFunc func(*user.SignedInUser)
IsDisabledFunc func() bool
DeclareFixedRolesFunc func(...accesscontrol.RoleRegistration) error
GetUserBuiltInRolesFunc func(user *user.SignedInUser) []string
@@ -133,6 +135,14 @@ func (m *Mock) GetUserPermissions(ctx context.Context, user *user.SignedInUser,
return m.permissions, nil
}
func (m *Mock) ClearUserPermissionCache(user *user.SignedInUser) {
m.Calls.ClearUserPermissionCache = append(m.Calls.ClearUserPermissionCache, []interface{}{user})
// Use override if provided
if m.ClearUserPermissionCacheFunc != nil {
m.ClearUserPermissionCacheFunc(user)
}
}
// Middleware checks if service disabled or not to switch to fallback authorization.
// This mock return m.disabled unless an override is provided.
func (m *Mock) IsDisabled() bool {


@@ -220,6 +220,10 @@ func (s *ServiceImpl) GetNavTree(c *models.ReqContext, hasEditPerm bool, prefs *
navTree = s.addHelpLinks(navTree, c)
if len(navTree) < 1 {
navTree = make([]*navtree.NavLink, 0)
}
return navTree, nil
}


@@ -132,7 +132,7 @@ func buildQueryDataService(t *testing.T, cs datasources.CacheService, fpc *fakeP
}
return query.ProvideService(
nil,
setting.NewCfg(),
cs,
nil,
&fakePluginRequestValidator{},


@@ -3,7 +3,6 @@ package query
import (
"context"
"fmt"
"net/http"
"time"
"github.com/grafana/grafana/pkg/api/dtos"
@@ -112,10 +111,17 @@ func (s *Service) QueryDataMultipleSources(ctx context.Context, user *user.Signe
// handleExpressions handles POST /api/ds/query when there is an expression.
func (s *Service) handleExpressions(ctx context.Context, user *user.SignedInUser, parsedReq *parsedRequest) (*backend.QueryDataResponse, error) {
exprReq := expr.Request{
OrgId: user.OrgID,
Queries: []expr.Query{},
}
if user != nil { // for passthrough authentication, SSE does not authenticate
exprReq.User = adapters.BackendUserFromSignedInUser(user)
exprReq.OrgId = user.OrgID
}
disallowedCookies := []string{s.cfg.LoginCookieName}
queryEnrichers := parsedReq.createDataSourceQueryEnrichers(ctx, user, s.oAuthTokenService, disallowedCookies)
for _, pq := range parsedReq.parsedQueries {
if pq.datasource == nil {
return nil, ErrMissingDataSourceInfo.Build(errutil.TemplateData{
@@ -136,6 +142,7 @@ func (s *Service) handleExpressions(ctx context.Context, user *user.SignedInUser
From: pq.query.TimeRange.From,
To: pq.query.TimeRange.To,
},
QueryEnricher: queryEnrichers[pq.datasource.Uid],
})
}
@@ -168,10 +175,11 @@ func (s *Service) handleQueryData(ctx context.Context, user *user.SignedInUser,
Queries: []backend.DataQuery{},
}
disallowedCookies := []string{s.cfg.LoginCookieName}
middlewares := []httpclient.Middleware{}
if parsedReq.httpRequest != nil {
middlewares = append(middlewares,
httpclientprovider.ForwardedCookiesMiddleware(parsedReq.httpRequest.Cookies(), ds.AllowedCookies(), []string{s.cfg.LoginCookieName}),
httpclientprovider.ForwardedCookiesMiddleware(parsedReq.httpRequest.Cookies(), ds.AllowedCookies(), disallowedCookies),
)
}
@@ -188,7 +196,7 @@ func (s *Service) handleQueryData(ctx context.Context, user *user.SignedInUser,
}
if parsedReq.httpRequest != nil {
proxyutil.ClearCookieHeader(parsedReq.httpRequest, ds.AllowedCookies(), []string{s.cfg.LoginCookieName})
proxyutil.ClearCookieHeader(parsedReq.httpRequest, ds.AllowedCookies(), disallowedCookies)
if cookieStr := parsedReq.httpRequest.Header.Get("Cookie"); cookieStr != "" {
req.Headers["Cookie"] = cookieStr
}
@@ -203,17 +211,7 @@ func (s *Service) handleQueryData(ctx context.Context, user *user.SignedInUser,
return s.pluginClient.QueryData(ctx, req)
}
type parsedQuery struct {
datasource *datasources.DataSource
query backend.DataQuery
}
type parsedRequest struct {
hasExpression bool
parsedQueries []parsedQuery
httpRequest *http.Request
}
// parseMetricRequest parses a request into parsed queries grouped by datasource uid
func (s *Service) parseMetricRequest(ctx context.Context, user *user.SignedInUser, skipCache bool, reqDTO dtos.MetricRequest) (*parsedRequest, error) {
if len(reqDTO.Queries) == 0 {
return nil, ErrNoQueriesFound


@@ -0,0 +1,91 @@
package query
import (
"context"
"fmt"
"net/http"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana-plugin-sdk-go/backend/httpclient"
"github.com/grafana/grafana/pkg/expr"
"github.com/grafana/grafana/pkg/infra/httpclient/httpclientprovider"
"github.com/grafana/grafana/pkg/services/datasources"
"github.com/grafana/grafana/pkg/services/oauthtoken"
"github.com/grafana/grafana/pkg/services/user"
"github.com/grafana/grafana/pkg/util/proxyutil"
"golang.org/x/oauth2"
)
type parsedQuery struct {
datasource *datasources.DataSource
query backend.DataQuery
}
type parsedRequest struct {
hasExpression bool
parsedQueries []parsedQuery
httpRequest *http.Request
}
func (pr parsedRequest) createDataSourceQueryEnrichers(ctx context.Context, signedInUser *user.SignedInUser, oAuthTokenService oauthtoken.OAuthTokenService, disallowedCookies []string) map[string]expr.QueryDataRequestEnricher {
datasourcesHeaderProvider := map[string]expr.QueryDataRequestEnricher{}
if pr.httpRequest == nil {
return datasourcesHeaderProvider
}
if len(pr.parsedQueries) == 0 || pr.parsedQueries[0].datasource == nil {
return datasourcesHeaderProvider
}
for _, q := range pr.parsedQueries {
ds := q.datasource
uid := ds.Uid
if expr.IsDataSource(uid) {
continue
}
if _, exists := datasourcesHeaderProvider[uid]; exists {
continue
}
allowedCookies := ds.AllowedCookies()
clonedReq := pr.httpRequest.Clone(pr.httpRequest.Context())
var token *oauth2.Token
if oAuthTokenService.IsOAuthPassThruEnabled(ds) {
token = oAuthTokenService.GetCurrentOAuthToken(ctx, signedInUser)
}
datasourcesHeaderProvider[uid] = func(ctx context.Context, req *backend.QueryDataRequest) context.Context {
if len(req.Headers) == 0 {
req.Headers = map[string]string{}
}
if len(allowedCookies) > 0 {
proxyutil.ClearCookieHeader(clonedReq, allowedCookies, disallowedCookies)
if cookieStr := clonedReq.Header.Get("Cookie"); cookieStr != "" {
req.Headers["Cookie"] = cookieStr
}
ctx = httpclient.WithContextualMiddleware(ctx, httpclientprovider.ForwardedCookiesMiddleware(clonedReq.Cookies(), allowedCookies, disallowedCookies))
}
if token != nil {
req.Headers["Authorization"] = fmt.Sprintf("%s %s", token.Type(), token.AccessToken)
idToken, ok := token.Extra("id_token").(string)
if ok && idToken != "" {
req.Headers["X-ID-Token"] = idToken
}
ctx = httpclient.WithContextualMiddleware(ctx, httpclientprovider.ForwardedOAuthIdentityMiddleware(token))
}
return ctx
}
}
return datasourcesHeaderProvider
}


@@ -1,18 +1,18 @@
package query_test
package query
import (
"context"
"errors"
"net/http"
"net/http/httptest"
"testing"
"github.com/grafana/grafana-plugin-sdk-go/backend"
"github.com/grafana/grafana/pkg/expr"
"github.com/stretchr/testify/require"
"golang.org/x/oauth2"
"github.com/grafana/grafana-plugin-sdk-go/backend/httpclient"
"github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/expr"
"github.com/grafana/grafana/pkg/infra/httpclient/httpclientprovider"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/plugins"
acmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
@@ -20,15 +20,243 @@ import (
fakeDatasources "github.com/grafana/grafana/pkg/services/datasources/fakes"
dsSvc "github.com/grafana/grafana/pkg/services/datasources/service"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/services/query"
"github.com/grafana/grafana/pkg/services/secrets/fakes"
secretskvs "github.com/grafana/grafana/pkg/services/secrets/kvstore"
secretsmng "github.com/grafana/grafana/pkg/services/secrets/manager"
"github.com/grafana/grafana/pkg/services/sqlstore"
"github.com/grafana/grafana/pkg/services/user"
"github.com/grafana/grafana/pkg/setting"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"golang.org/x/oauth2"
)
func TestParseMetricRequest(t *testing.T) {
t.Run("Test a simple single datasource query", func(t *testing.T) {
tc := setup(t)
json, err := simplejson.NewJson([]byte(`{
"keepCookies": [ "cookie1", "cookie3", "login" ]
}`))
require.NoError(t, err)
tc.dataSourceCache.dsByUid = func(ctx context.Context, datasourceUID string, user *user.SignedInUser, skipCache bool) (*datasources.DataSource, error) {
if datasourceUID == "gIEkMvIVz" {
return &datasources.DataSource{
Uid: "gIEkMvIVz",
JsonData: json,
}, nil
}
return nil, nil
}
token := &oauth2.Token{
TokenType: "bearer",
AccessToken: "access-token",
}
token = token.WithExtra(map[string]interface{}{"id_token": "id-token"})
tc.oauthTokenService.passThruEnabled = true
tc.oauthTokenService.token = token
mr := metricRequestWithQueries(t, `{
"refId": "A",
"datasource": {
"uid": "gIEkMvIVz",
"type": "postgres"
}
}`, `{
"refId": "B",
"datasource": {
"uid": "gIEkMvIVz",
"type": "postgres"
}
}`)
parsedReq, err := tc.queryService.parseMetricRequest(context.Background(), tc.signedInUser, true, mr)
require.NoError(t, err)
require.NotNil(t, parsedReq)
assert.False(t, parsedReq.hasExpression)
assert.Len(t, parsedReq.parsedQueries, 2)
assert.Equal(t, "gIEkMvIVz", parsedReq.parsedQueries[0].datasource.Uid)
assert.Equal(t, "gIEkMvIVz", parsedReq.parsedQueries[1].datasource.Uid)
t.Run("createDataSourceQueryEnrichers should return 0 enrichers when no HTTP request", func(t *testing.T) {
enrichers := parsedReq.createDataSourceQueryEnrichers(context.Background(), nil, tc.oauthTokenService, []string{})
require.Empty(t, enrichers)
})
t.Run("createDataSourceQueryEnrichers should return 1 enricher", func(t *testing.T) {
parsedReq.httpRequest = httptest.NewRequest(http.MethodGet, "/", nil)
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "cookie1"})
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "cookie2"})
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "cookie3"})
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "login"})
enrichers := parsedReq.createDataSourceQueryEnrichers(context.Background(), nil, tc.oauthTokenService, []string{"login"})
require.Len(t, enrichers, 1)
require.NotNil(t, enrichers["gIEkMvIVz"])
req := &backend.QueryDataRequest{}
ctx := enrichers["gIEkMvIVz"](context.Background(), req)
require.Len(t, req.Headers, 3)
require.Equal(t, "Bearer access-token", req.Headers["Authorization"])
require.Equal(t, "id-token", req.Headers["X-ID-Token"])
require.Equal(t, "cookie1=; cookie3=", req.Headers["Cookie"])
middlewares := httpclient.ContextualMiddlewareFromContext(ctx)
require.Len(t, middlewares, 2)
require.Equal(t, httpclientprovider.ForwardedCookiesMiddlewareName, middlewares[0].(httpclient.MiddlewareName).MiddlewareName())
require.Equal(t, httpclientprovider.ForwardedOAuthIdentityMiddlewareName, middlewares[1].(httpclient.MiddlewareName).MiddlewareName())
})
})
t.Run("Test a single datasource query with expressions", func(t *testing.T) {
tc := setup(t)
json, err := simplejson.NewJson([]byte(`{
"keepCookies": [ "cookie1", "cookie3", "login" ]
}`))
require.NoError(t, err)
tc.dataSourceCache.dsByUid = func(ctx context.Context, datasourceUID string, user *user.SignedInUser, skipCache bool) (*datasources.DataSource, error) {
if datasourceUID == "gIEkMvIVz" {
return &datasources.DataSource{
Uid: "gIEkMvIVz",
JsonData: json,
}, nil
}
return nil, nil
}
token := &oauth2.Token{
TokenType: "bearer",
AccessToken: "access-token",
}
token = token.WithExtra(map[string]interface{}{"id_token": "id-token"})
tc.oauthTokenService.passThruEnabled = true
tc.oauthTokenService.token = token
mr := metricRequestWithQueries(t, `{
"refId": "A",
"datasource": {
"uid": "gIEkMvIVz",
"type": "postgres"
}
}`, `{
"refId": "B",
"datasource": {
"type": "__expr__",
"uid": "__expr__",
"name": "Expression"
},
"type": "math",
"expression": "$A - 50"
}`)
parsedReq, err := tc.queryService.parseMetricRequest(context.Background(), tc.signedInUser, true, mr)
require.NoError(t, err)
require.NotNil(t, parsedReq)
assert.True(t, parsedReq.hasExpression)
assert.Len(t, parsedReq.parsedQueries, 2)
assert.Equal(t, "gIEkMvIVz", parsedReq.parsedQueries[0].datasource.Uid)
assert.Equal(t, expr.DatasourceUID, parsedReq.parsedQueries[1].datasource.Uid)
// Make sure we end up with something valid
_, err = tc.queryService.handleExpressions(context.Background(), tc.signedInUser, parsedReq)
assert.NoError(t, err)
t.Run("createDataSourceQueryEnrichers should return 1 enricher", func(t *testing.T) {
parsedReq.httpRequest = httptest.NewRequest(http.MethodGet, "/", nil)
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "cookie1"})
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "cookie2"})
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "cookie3"})
parsedReq.httpRequest.AddCookie(&http.Cookie{Name: "login"})
enrichers := parsedReq.createDataSourceQueryEnrichers(context.Background(), nil, tc.oauthTokenService, []string{"login"})
require.Len(t, enrichers, 1)
require.NotNil(t, enrichers["gIEkMvIVz"])
req := &backend.QueryDataRequest{}
ctx := enrichers["gIEkMvIVz"](context.Background(), req)
require.Len(t, req.Headers, 3)
require.Equal(t, "Bearer access-token", req.Headers["Authorization"])
require.Equal(t, "id-token", req.Headers["X-ID-Token"])
require.Equal(t, "cookie1=; cookie3=", req.Headers["Cookie"])
middlewares := httpclient.ContextualMiddlewareFromContext(ctx)
require.Len(t, middlewares, 2)
require.Equal(t, httpclientprovider.ForwardedCookiesMiddlewareName, middlewares[0].(httpclient.MiddlewareName).MiddlewareName())
require.Equal(t, httpclientprovider.ForwardedOAuthIdentityMiddlewareName, middlewares[1].(httpclient.MiddlewareName).MiddlewareName())
})
})
t.Run("Test a mixed datasource query with expressions", func(t *testing.T) {
tc := setup(t)
mr := metricRequestWithQueries(t, `{
"refId": "A",
"datasource": {
"uid": "gIEkMvIVz",
"type": "postgres"
}
}`, `{
"refId": "B",
"datasource": {
"uid": "sEx6ZvSVk",
"type": "testdata"
}
}`, `{
"refId": "A_resample",
"datasource": {
"type": "__expr__",
"uid": "__expr__",
"name": "Expression"
},
"expression": "A",
"type": "resample",
"downsampler": "mean",
"upsampler": "fillna",
"window": "10s"
}`, `{
"refId": "B_resample",
"datasource": {
"type": "__expr__",
"uid": "__expr__",
"name": "Expression"
},
"expression": "B",
"type": "resample",
"downsampler": "mean",
"upsampler": "fillna",
"window": "10s"
}`, `{
"refId": "C",
"datasource": {
"type": "__expr__",
"uid": "__expr__",
"name": "Expression"
},
"type": "math",
"expression": "$A_resample + $B_resample"
}`)
parsedReq, err := tc.queryService.parseMetricRequest(context.Background(), tc.signedInUser, true, mr)
require.NoError(t, err)
require.NotNil(t, parsedReq)
assert.True(t, parsedReq.hasExpression)
assert.Len(t, parsedReq.parsedQueries, 5)
assert.Equal(t, "gIEkMvIVz", parsedReq.parsedQueries[0].datasource.Uid)
assert.Equal(t, "sEx6ZvSVk", parsedReq.parsedQueries[1].datasource.Uid)
assert.Equal(t, expr.DatasourceUID, parsedReq.parsedQueries[2].datasource.Uid)
assert.Equal(t, expr.DatasourceUID, parsedReq.parsedQueries[3].datasource.Uid)
assert.Equal(t, expr.DatasourceUID, parsedReq.parsedQueries[4].datasource.Uid)
// Make sure we end up with something valid
_, err = tc.queryService.handleExpressions(context.Background(), tc.signedInUser, parsedReq)
assert.NoError(t, err)
t.Run("createDataSourceQueryEnrichers should return 2 enrichers", func(t *testing.T) {
parsedReq.httpRequest = &http.Request{}
enrichers := parsedReq.createDataSourceQueryEnrichers(context.Background(), nil, tc.oauthTokenService, []string{})
require.Len(t, enrichers, 2)
require.NotNil(t, enrichers["gIEkMvIVz"])
require.NotNil(t, enrichers["sEx6ZvSVk"])
})
})
}
func TestQueryDataMultipleSources(t *testing.T) {
t.Run("can query multiple datasources", func(t *testing.T) {
tc := setup(t)
@@ -127,7 +355,12 @@ func TestQueryData(t *testing.T) {
tc.oauthTokenService.passThruEnabled = true
tc.oauthTokenService.token = token
_, err := tc.queryService.QueryData(context.Background(), nil, true, metricRequest(), false)
metricReq := metricRequest()
httpReq, err := http.NewRequest(http.MethodGet, "/", nil)
require.NoError(t, err)
metricReq.HTTPRequest = httpReq
_, err = tc.queryService.QueryData(context.Background(), nil, true, metricReq, false)
require.Nil(t, err)
expected := map[string]string{
@@ -190,7 +423,9 @@ func setup(t *testing.T) *testContext {
DataSources: nil,
SimulatePluginFailure: false,
}
exprService := expr.ProvideService(nil, pc, fakeDatasourceService)
cfg := setting.NewCfg()
cfg.ExpressionsEnabled = true
exprService := expr.ProvideService(cfg, pc, fakeDatasourceService)
return &testContext{
pluginContext: pc,
@@ -198,7 +433,8 @@ func setup(t *testing.T) *testContext {
dataSourceCache: dc,
oauthTokenService: tc,
pluginRequestValidator: rv,
queryService: query.ProvideService(setting.NewCfg(), dc, exprService, rv, ds, pc, tc),
queryService: ProvideService(setting.NewCfg(), dc, exprService, rv, ds, pc, tc),
signedInUser: &user.SignedInUser{OrgID: 1},
}
}
@@ -208,7 +444,8 @@ type testContext struct {
dataSourceCache *fakeDataSourceCache
oauthTokenService *fakeOAuthTokenService
pluginRequestValidator *fakePluginRequestValidator
queryService *query.Service
queryService *Service
signedInUser *user.SignedInUser
}
func metricRequest() dtos.MetricRequest {
@@ -221,6 +458,22 @@ func metricRequest() dtos.MetricRequest {
}
}
func metricRequestWithQueries(t *testing.T, rawQueries ...string) dtos.MetricRequest {
t.Helper()
queries := make([]*simplejson.Json, 0)
for _, q := range rawQueries {
json, err := simplejson.NewJson([]byte(q))
require.NoError(t, err)
queries = append(queries, json)
}
return dtos.MetricRequest{
From: "now-1h",
To: "now",
Queries: queries,
Debug: false,
}
}
type fakePluginRequestValidator struct {
err error
}
@@ -243,7 +496,8 @@ func (ts *fakeOAuthTokenService) IsOAuthPassThruEnabled(*datasources.DataSource)
}
type fakeDataSourceCache struct {
ds *datasources.DataSource
ds *datasources.DataSource
dsByUid func(ctx context.Context, datasourceUID string, user *user.SignedInUser, skipCache bool) (*datasources.DataSource, error)
}
func (c *fakeDataSourceCache) GetDatasource(ctx context.Context, datasourceID int64, user *user.SignedInUser, skipCache bool) (*datasources.DataSource, error) {
@@ -251,7 +505,13 @@ func (c *fakeDataSourceCache) GetDatasource(ctx context.Context, datasourceID in
}
func (c *fakeDataSourceCache) GetDatasourceByUID(ctx context.Context, datasourceUID string, user *user.SignedInUser, skipCache bool) (*datasources.DataSource, error) {
return c.ds, nil
if c.dsByUid != nil {
return c.dsByUid(ctx, datasourceUID, user, skipCache)
}
return &datasources.DataSource{
Uid: datasourceUID,
}, nil
}
type fakePluginClient struct {


@@ -21,31 +21,34 @@ import (
)
type ServiceAccountsAPI struct {
cfg *setting.Cfg
service serviceaccounts.Service
accesscontrol accesscontrol.AccessControl
RouterRegister routing.RouteRegister
store serviceaccounts.Store
log log.Logger
permissionService accesscontrol.ServiceAccountPermissionsService
cfg *setting.Cfg
service serviceaccounts.Service
accesscontrol accesscontrol.AccessControl
accesscontrolService accesscontrol.Service
RouterRegister routing.RouteRegister
store serviceaccounts.Store
log log.Logger
permissionService accesscontrol.ServiceAccountPermissionsService
}
func NewServiceAccountsAPI(
cfg *setting.Cfg,
service serviceaccounts.Service,
accesscontrol accesscontrol.AccessControl,
accesscontrolService accesscontrol.Service,
routerRegister routing.RouteRegister,
store serviceaccounts.Store,
permissionService accesscontrol.ServiceAccountPermissionsService,
) *ServiceAccountsAPI {
return &ServiceAccountsAPI{
cfg: cfg,
service: service,
accesscontrol: accesscontrol,
RouterRegister: routerRegister,
store: store,
log: log.New("serviceaccounts.api"),
permissionService: permissionService,
cfg: cfg,
service: service,
accesscontrol: accesscontrol,
accesscontrolService: accesscontrolService,
RouterRegister: routerRegister,
store: store,
log: log.New("serviceaccounts.api"),
permissionService: permissionService,
}
}
@@ -127,6 +130,10 @@ func (api *ServiceAccountsAPI) CreateServiceAccount(c *models.ReqContext) respon
return response.Error(http.StatusInternalServerError, "Failed to set permissions for service account creator", err)
}
}
// Clear permission cache for the user who's created the service account, so that new permissions are fetched for their next call
// Required for cases when caller wants to immediately interact with the newly created object
api.accesscontrolService.ClearUserPermissionCache(c.SignedInUser)
}
return response.JSON(http.StatusCreated, serviceAccount)


@@ -16,6 +16,7 @@ import (
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/accesscontrol/actest"
accesscontrolmock "github.com/grafana/grafana/pkg/services/accesscontrol/mock"
"github.com/grafana/grafana/pkg/services/accesscontrol/ossaccesscontrol"
"github.com/grafana/grafana/pkg/services/apikey/apikeyimpl"
@@ -284,8 +285,9 @@ func setupTestServer(t *testing.T, svc *tests.ServiceAccountMock,
teamSvc := teamimpl.ProvideService(sqlStore, cfg)
saPermissionService, err := ossaccesscontrol.ProvideServiceAccountPermissions(cfg, routing.NewRouteRegister(), sqlStore, acmock, &licensing.OSSLicensingService{}, saStore, acmock, teamSvc)
require.NoError(t, err)
acService := actest.FakeService{}
a := NewServiceAccountsAPI(cfg, svc, acmock, routerRegister, saStore, saPermissionService)
a := NewServiceAccountsAPI(cfg, svc, acmock, acService, routerRegister, saStore, saPermissionService)
a.RegisterAPIEndpoints()
a.cfg.ApiKeyMaxSecondsToLive = -1 // disable api key expiration


@@ -45,7 +45,7 @@ func ProvideServiceAccountsService(
usageStats.RegisterMetricsFunc(s.getUsageMetrics)
serviceaccountsAPI := api.NewServiceAccountsAPI(cfg, s, ac, routeRegister, s.store, permissionService)
serviceaccountsAPI := api.NewServiceAccountsAPI(cfg, s, ac, accesscontrolService, routeRegister, s.store, permissionService)
serviceaccountsAPI.RegisterAPIEndpoints()
return s, nil


@@ -191,7 +191,10 @@ func (e *AzureLogAnalyticsDatasource) executeQuery(ctx context.Context, query *A
if err != nil {
return dataResponseErrorWithExecuted(err)
}
appendErrorNotice(frame, logResponse.Error)
frame = appendErrorNotice(frame, logResponse.Error)
if frame == nil {
return dataResponse
}
model, err := simplejson.NewJson(query.JSON)
if err != nil {
@@ -223,10 +226,15 @@ func (e *AzureLogAnalyticsDatasource) executeQuery(ctx context.Context, query *A
return dataResponse
}
func appendErrorNotice(frame *data.Frame, err *AzureLogAnalyticsAPIError) {
if err != nil {
frame.AppendNotices(apiErrorToNotice(err))
func appendErrorNotice(frame *data.Frame, err *AzureLogAnalyticsAPIError) *data.Frame {
if err == nil {
return frame
}
if frame == nil {
frame = &data.Frame{}
}
frame.AppendNotices(apiErrorToNotice(err))
return frame
}
func (e *AzureLogAnalyticsDatasource) createRequest(ctx context.Context, dsInfo types.DatasourceInfo, url string) (*http.Request, error) {


@@ -45,12 +45,7 @@ func apiErrorToNotice(err *AzureLogAnalyticsAPIError) data.Notice {
// ResponseTableToFrame converts an AzureResponseTable to a data.Frame.
func ResponseTableToFrame(table *types.AzureResponseTable, refID string, executedQuery string) (*data.Frame, error) {
if len(table.Rows) == 0 {
return &data.Frame{
RefID: refID,
Meta: &data.FrameMeta{
ExecutedQueryString: executedQuery,
},
}, nil
return nil, nil
}
converterFrame, err := converterFrameForTable(table)


@@ -173,12 +173,7 @@ func TestLogTableToFrame(t *testing.T) {
name: "empty data response",
testFile: "loganalytics/11-log-analytics-response-empty.json",
expectedFrame: func() *data.Frame {
return &data.Frame{
RefID: "A",
Meta: &data.FrameMeta{
ExecutedQueryString: "query",
},
}
return nil
},
},
}


@@ -5,6 +5,7 @@ import (
"encoding/json"
"fmt"
"net/http"
"net/textproto"
"regexp"
"strings"
"sync"
@@ -122,18 +123,22 @@ func (s *Service) CallResource(ctx context.Context, req *backend.CallResourceReq
func getAuthHeadersForCallResource(headers map[string][]string) map[string]string {
data := make(map[string]string)
if auth := arrayHeaderFirstValue(headers["Authorization"]); auth != "" {
data["Authorization"] = auth
}
for k, values := range headers {
k = textproto.CanonicalMIMEHeaderKey(k)
firstValue := arrayHeaderFirstValue(values)
if cookie := arrayHeaderFirstValue(headers["Cookie"]); cookie != "" {
data["Cookie"] = cookie
if firstValue == "" {
continue
}
switch k {
case "Authorization":
data["Authorization"] = firstValue
case "X-Id-Token":
data["X-ID-Token"] = firstValue
case "Cookie":
data["Cookie"] = firstValue
}
}
if idToken := arrayHeaderFirstValue(headers["X-ID-Token"]); idToken != "" {
data["X-ID-Token"] = idToken
}
return data
}


@@ -0,0 +1,72 @@
package loki
import (
"testing"
"github.com/stretchr/testify/assert"
)
func TestGetHeadersForCallResource(t *testing.T) {
const idTokn1 = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c"
const idTokn2 = "eyJhbGciOiJIUzI1NiJ9.eyJuYW1lIjoiSm9obiBEb2UiLCJleHAiOjE2Njg2MjExODQsImlhdCI6MTY2ODYyMTE4NH0.bg0Y0S245DeANhNnnLBCfGYBseTld29O0xynhQwZZlU"
const authTokn1 = "Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c"
const authTokn2 = "Bearer eyJhbGciOiJIUzI1NiJ9.eyJuYW1lIjoiSm9obiBEb2UiLCJleHAiOjE2Njg2MjExODQsImlhdCI6MTY2ODYyMTE4NH0.bg0Y0S245DeANhNnnLBCfGYBseTld29O0xynhQwZZlU"
testCases := map[string]struct {
headers map[string][]string
expectedHeaders map[string]string
}{
"Headers with empty value": {
headers: map[string][]string{
"X-Grafana-Org-Id": {"1"},
"Cookie": {""},
"X-Id-Token": {""},
"Accept-Encoding": {""},
"Authorization": {""},
},
expectedHeaders: map[string]string{},
},
"Headers with multiple values": {
headers: map[string][]string{
"Authorization": {authTokn1, authTokn2},
"Cookie": {"a=1"},
"X-Grafana-Org-Id": {"1"},
"Accept-Encoding": {"gzip", "compress"},
"X-Id-Token": {idTokn1, idTokn2},
},
expectedHeaders: map[string]string{
"Authorization": authTokn1,
"Cookie": "a=1",
"X-ID-Token": idTokn1,
},
},
"Headers with single value": {
headers: map[string][]string{
"Authorization": {authTokn1},
"X-Grafana-Org-Id": {"1"},
"Cookie": {"a=1"},
"Accept-Encoding": {"gzip"},
"X-Id-Token": {idTokn1},
},
expectedHeaders: map[string]string{
"Authorization": authTokn1,
"Cookie": "a=1",
"X-ID-Token": idTokn1,
},
},
"Non Canonical 'X-Id-Token' header key": {
headers: map[string][]string{
"X-ID-TOKEN": {idTokn1},
},
expectedHeaders: map[string]string{
"X-ID-Token": idTokn1,
},
},
}
for name, test := range testCases {
t.Run(name, func(t *testing.T) {
headers := getAuthHeadersForCallResource(test.headers)
assert.Equal(t, test.expectedHeaders, headers)
})
}
}


@@ -1,6 +1,6 @@
{
"name": "@grafana-plugins/input-datasource",
"version": "9.2.6",
"version": "9.2.7",
"description": "Input Datasource",
"private": true,
"repository": {
@@ -15,15 +15,15 @@
},
"author": "Grafana Labs",
"devDependencies": {
"@grafana/toolkit": "9.2.6",
"@grafana/toolkit": "9.2.7",
"@types/jest": "26.0.15",
"@types/lodash": "4.14.149",
"@types/react": "17.0.30",
"lodash": "4.17.21"
},
"dependencies": {
"@grafana/data": "9.2.6",
"@grafana/ui": "9.2.6",
"@grafana/data": "9.2.7",
"@grafana/ui": "9.2.7",
"jquery": "3.5.1",
"react": "17.0.1",
"react-dom": "17.0.1",


@@ -2,7 +2,7 @@ import { css } from '@emotion/css';
import React, { FC, Fragment, useState } from 'react';
import { useLocation } from 'react-router-dom';
import { GrafanaTheme2, urlUtil } from '@grafana/data';
import { GrafanaTheme2, textUtil, urlUtil } from '@grafana/data';
import { config } from '@grafana/runtime';
import { Button, ClipboardButton, ConfirmModal, HorizontalGroup, LinkButton, useStyles2 } from '@grafana/ui';
import { useAppNotification } from 'app/core/copy/appNotification';
@@ -102,7 +102,7 @@ export const RuleDetailsActionButtons: FC<Props> = ({ rule, rulesSource }) => {
variant="primary"
icon="book"
target="__blank"
href={rule.annotations[Annotation.runbookURL]}
href={textUtil.sanitizeUrl(rule.annotations[Annotation.runbookURL])}
>
View runbook
</LinkButton>


@@ -1,9 +1,8 @@
import React, { useCallback, useEffect, useRef } from 'react';
import { SQLEditor } from '@grafana/experimental';
import { LanguageDefinition, SQLEditor } from '@grafana/experimental';
import { LanguageCompletionProvider, SQLQuery } from '../../types';
import { formatSQL } from '../../utils/formatSQL';
import { SQLQuery } from '../../types';
type Props = {
query: SQLQuery;
@@ -11,10 +10,10 @@ type Props = {
children?: (props: { formatQuery: () => void }) => React.ReactNode;
width?: number;
height?: number;
completionProvider: LanguageCompletionProvider;
editorLanguageDefinition: LanguageDefinition;
};
export function QueryEditorRaw({ children, onChange, query, width, height, completionProvider }: Props) {
export function QueryEditorRaw({ children, onChange, query, width, height, editorLanguageDefinition }: Props) {
// We need to pass query via ref to SQLEditor as onChange is executed via monacoEditor.onDidChangeModelContent callback, not onChange property
const queryRef = useRef<SQLQuery>(query);
useEffect(() => {
@@ -39,7 +38,7 @@ export function QueryEditorRaw({ children, onChange, query, width, height, compl
height={height}
query={query.rawSql!}
onChange={onRawQueryChange}
language={{ id: 'sql', completionProvider, formatter: formatSQL }}
language={editorLanguageDefinition}
>
{children}
</SQLEditor>


@@ -25,12 +25,12 @@ export function RawEditor({ db, query, onChange, onRunQuery, onValidate, queryTo
const [toolboxRef, toolboxMeasure] = useMeasure<HTMLDivElement>();
const [editorRef, editorMeasure] = useMeasure<HTMLDivElement>();
const completionProvider = useMemo(() => db.getSqlCompletionProvider(), [db]);
const editorLanguageDefinition = useMemo(() => db.getEditorLanguageDefinition(), [db]);
const renderQueryEditor = (width?: number, height?: number) => {
return (
<QueryEditorRaw
completionProvider={completionProvider}
editorLanguageDefinition={editorLanguageDefinition}
query={query}
width={width}
height={height ? height - toolboxMeasure.height : undefined}


@@ -1,8 +1,8 @@
import React from 'react';
import { useAsync } from 'react-use';
import { SelectableValue, toOption } from '@grafana/data';
import { COMMON_AGGREGATE_FNS } from '../../constants';
import { QueryWithDefaults } from '../../defaults';
import { DB, SQLQuery } from '../../types';
import { useSqlChange } from '../../utils/useSqlChange';
@@ -18,11 +18,7 @@ interface SQLSelectRowProps {
export function SQLSelectRow({ fields, query, onQueryChange, db }: SQLSelectRowProps) {
const { onSqlChange } = useSqlChange({ query, onQueryChange, db });
const functions = [...COMMON_AGGREGATE_FNS, ...(db.functions?.() || [])].map(toOption);
const state = useAsync(async () => {
const functions = await db.functions();
return functions.map((f) => toOption(f.name));
}, [db]);
return <SelectRow columns={fields} sql={query.sql!} functions={state.value} onSqlChange={onSqlChange} />;
return <SelectRow columns={fields} sql={query.sql!} functions={functions} onSqlChange={onSqlChange} />;
}


@@ -1,115 +1,4 @@
import { OperatorType } from './types';
export const AGGREGATE_FNS = [
{
id: 'AVG',
name: 'AVG',
description: `AVG(
[DISTINCT]
expression
)
[OVER (...)]
Returns the average of non-NULL input values, or NaN if the input contains a NaN.`,
},
{
id: 'COUNT',
name: 'COUNT',
description: `COUNT(*) [OVER (...)]
Returns the number of rows in the input.
COUNT(
[DISTINCT]
expression
)
[OVER (...)]
Returns the number of rows with expression evaluated to any value other than NULL.
`,
},
{
id: 'MAX',
name: 'MAX',
description: `MAX(
expression
)
[OVER (...)]
Returns the maximum value of non-NULL expressions. Returns NULL if there are zero input rows or expression evaluates to NULL for all rows. Returns NaN if the input contains a NaN.
`,
},
{
id: 'MIN',
name: 'MIN',
description: `MIN(
expression
)
[OVER (...)]
Returns the minimum value of non-NULL expressions. Returns NULL if there are zero input rows or expression evaluates to NULL for all rows. Returns NaN if the input contains a NaN.
`,
},
{
id: 'SUM',
name: 'SUM',
description: `SUM(
[DISTINCT]
expression
)
[OVER (...)]
Returns the sum of non-null values.
If the expression is a floating point value, the sum is non-deterministic, which means you might receive a different result each time you use this function.
`,
},
];
export const OPERATORS = [
{ type: OperatorType.Comparison, id: 'LESS_THAN', operator: '<', description: 'Returns TRUE if X is less than Y.' },
{
type: OperatorType.Comparison,
id: 'LESS_THAN_EQUAL',
operator: '<=',
description: 'Returns TRUE if X is less than or equal to Y.',
},
{
type: OperatorType.Comparison,
id: 'GREATER_THAN',
operator: '>',
description: 'Returns TRUE if X is greater than Y.',
},
{
type: OperatorType.Comparison,
id: 'GREATER_THAN_EQUAL',
operator: '>=',
description: 'Returns TRUE if X is greater than or equal to Y.',
},
{ type: OperatorType.Comparison, id: 'EQUAL', operator: '=', description: 'Returns TRUE if X is equal to Y.' },
{
type: OperatorType.Comparison,
id: 'NOT_EQUAL',
operator: '!=',
description: 'Returns TRUE if X is not equal to Y.',
},
{
type: OperatorType.Comparison,
id: 'NOT_EQUAL_ALT',
operator: '<>',
description: 'Returns TRUE if X is not equal to Y.',
},
{
type: OperatorType.Comparison,
id: 'LIKE',
operator: 'LIKE',
description: `Checks if the STRING in the first operand X matches a pattern specified by the second operand Y. Expressions can contain these characters:
- A percent sign "%" matches any number of characters or bytes
- An underscore "_" matches a single character or byte
- You can escape "\", "_", or "%" using two backslashes. For example, "\\%". If you are using raw strings, only a single backslash is required. For example, r"\%".`,
},
{ type: OperatorType.Logical, id: 'AND', operator: 'AND' },
{ type: OperatorType.Logical, id: 'OR', operator: 'OR' },
];
export const COMMON_AGGREGATE_FNS = ['AVG', 'COUNT', 'MAX', 'MIN', 'SUM'];
export const MACRO_NAMES = [
'$__time',


@@ -9,7 +9,7 @@ import {
TimeRange,
toOption as toOptionFromData,
} from '@grafana/data';
import { CompletionItemKind, EditorMode, LanguageCompletionProvider } from '@grafana/experimental';
import { CompletionItemKind, EditorMode, LanguageDefinition } from '@grafana/experimental';
import { QueryWithDefaults } from './defaults';
import {
@@ -122,12 +122,6 @@ export interface SQLSelectableValue extends SelectableValue {
raqbFieldType?: RAQBFieldTypes;
}
export interface Aggregate {
id: string;
name: string;
description?: string;
}
export interface DB {
init?: (datasourceId?: string) => Promise<boolean>;
datasets: () => Promise<string[]>;
@@ -136,10 +130,10 @@ export interface DB {
validateQuery: (query: SQLQuery, range?: TimeRange) => Promise<ValidationResults>;
dsID: () => number;
dispose?: (dsID?: string) => void;
lookup: (path?: string) => Promise<Array<{ name: string; completion: string }>>;
getSqlCompletionProvider: () => LanguageCompletionProvider;
lookup?: (path?: string) => Promise<Array<{ name: string; completion: string }>>;
getEditorLanguageDefinition: () => LanguageDefinition;
toRawSql?: (query: SQLQuery) => string;
functions: () => Promise<Aggregate[]>;
functions?: () => string[];
}
export interface QueryEditorProps {
@@ -173,18 +167,3 @@ export interface MetaDefinition {
completion?: string;
kind: CompletionItemKind;
}
export {
CompletionItemKind,
LanguageCompletionProvider,
LinkedToken,
ColumnDefinition,
CompletionItemPriority,
StatementPlacementProvider,
SuggestionKindProvider,
TableDefinition,
TokenType,
OperatorType,
StatementPosition,
PositionContext,
} from '@grafana/experimental';


@@ -0,0 +1,28 @@
import { QueryFormat } from '../types';
import migrateAnnotation from './migration';
describe('Annotation migration', () => {
const annotation = {
datasource: {
uid: 'P4FDCC188E688367F',
type: 'mysql',
},
enable: false,
hide: false,
iconColor: 'rgba(0, 211, 255, 1)',
limit: 100,
name: 'Single',
rawQuery:
"SELECT\n createdAt as time,\n 'single' as text,\n hostname as tags\nFROM\n grafana_metric\nWHERE\n $__timeFilter(createdAt)\nORDER BY time\nLIMIT 1\n",
showIn: 0,
tags: [],
type: 'tags',
};
it('should migrate from old format to new', () => {
const newAnnotationFormat = migrateAnnotation(annotation);
expect(newAnnotationFormat.target?.format).toBe(QueryFormat.Table);
expect(newAnnotationFormat.target?.rawSql).toBe(annotation.rawQuery);
});
});


@@ -1,6 +1,6 @@
import { AnnotationQuery } from '@grafana/data';
import { EditorMode } from '@grafana/experimental';
import { applyQueryDefaults } from '../defaults';
import { SQLQuery } from '../types';
export default function migrateAnnotation(annotation: AnnotationQuery<SQLQuery>) {
@@ -10,12 +10,7 @@ export default function migrateAnnotation(annotation: AnnotationQuery<SQLQuery>)
return annotation;
}
const newQuery: SQLQuery = {
...(annotation.target ?? {}),
refId: annotation.target?.refId ?? 'Anno',
editorMode: EditorMode.Code,
rawSql: oldQuery,
};
const newQuery = applyQueryDefaults({ refId: 'Annotation', ...(annotation.target ?? {}), rawSql: oldQuery });
return {
...annotation,


@@ -1,5 +1,5 @@
import { CompletionItemPriority } from '@grafana/experimental';
import { Monaco, monacoTypes } from '@grafana/ui';
import { CompletionItemPriority } from 'app/features/plugins/sql';
import { afterLabelValue, insideLabelValue } from '../__mocks__/dynamic-label-test-data';
import MonacoMock from '../__mocks__/monarch/Monaco';


@@ -6,16 +6,16 @@ import { findRow, parseResourceURI, setResource } from './utils';
describe('AzureMonitor ResourcePicker utils', () => {
describe('parseResourceURI', () => {
it('should parse subscription URIs', () => {
expect(parseResourceURI('/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572')).toEqual({
subscription: '44693801-6ee6-49de-9b2d-9106972f9572',
expect(parseResourceURI('/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee')).toEqual({
subscription: 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
});
});
it('should parse resource group URIs', () => {
expect(
parseResourceURI('/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources')
parseResourceURI('/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources')
).toEqual({
subscription: '44693801-6ee6-49de-9b2d-9106972f9572',
subscription: 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
resourceGroup: 'cloud-datasources',
});
});
@@ -23,10 +23,10 @@ describe('AzureMonitor ResourcePicker utils', () => {
it('should parse resource URIs', () => {
expect(
parseResourceURI(
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/Microsoft.Compute/virtualMachines/GithubTestDataVM'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Compute/virtualMachines/GithubTestDataVM'
)
).toEqual({
subscription: '44693801-6ee6-49de-9b2d-9106972f9572',
subscription: 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
resourceGroup: 'cloud-datasources',
metricNamespace: 'Microsoft.Compute/virtualMachines',
resourceName: 'GithubTestDataVM',
@@ -36,10 +36,10 @@ describe('AzureMonitor ResourcePicker utils', () => {
it('should parse resource URIs with a subresource', () => {
expect(
parseResourceURI(
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0/fileServices/default'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0/fileServices/default'
)
).toEqual({
subscription: '44693801-6ee6-49de-9b2d-9106972f9572',
subscription: 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
resourceGroup: 'cloud-datasources',
metricNamespace: 'Microsoft.Storage/storageAccounts/fileServices',
resourceName: 'csb100320016c43d2d0/default',
@@ -47,30 +47,30 @@ describe('AzureMonitor ResourcePicker utils', () => {
});
it('returns undefined for invalid input', () => {
expect(parseResourceURI('44693801-6ee6-49de-9b2d-9106972f9572')).toEqual({});
expect(parseResourceURI('aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee')).toEqual({});
});
it('returns a valid response with a missing element in the metric namespace and name', () => {
expect(
parseResourceURI(
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/foo'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/foo'
)
).toEqual({
metricNamespace: 'foo',
resourceGroup: 'cloud-datasources',
resourceName: '',
subscription: '44693801-6ee6-49de-9b2d-9106972f9572',
subscription: 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
});
expect(
parseResourceURI(
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/foo/bar'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/foo/bar'
)
).toEqual({
metricNamespace: 'foo/bar',
resourceGroup: 'cloud-datasources',
resourceName: '',
subscription: '44693801-6ee6-49de-9b2d-9106972f9572',
subscription: 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee',
});
});
});
@@ -87,7 +87,7 @@ describe('AzureMonitor ResourcePicker utils', () => {
const rows: ResourceRowGroup = [
{
id: '',
uri: '/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0',
uri: '/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0',
name: '',
type: ResourceRowType.Resource,
typeLabel: '',
@@ -96,7 +96,7 @@ describe('AzureMonitor ResourcePicker utils', () => {
expect(
findRow(
rows,
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0/fileServices/default'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0/fileServices/default'
)
).toEqual(rows[0]);
});
@@ -105,7 +105,7 @@ describe('AzureMonitor ResourcePicker utils', () => {
const rows: ResourceRowGroup = [
{
id: '',
uri: '/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/microsoft.storage/storageaccounts/csb100320016c43d2d0',
uri: '/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/microsoft.storage/storageaccounts/csb100320016c43d2d0',
name: '',
type: ResourceRowType.Resource,
typeLabel: '',
@@ -114,7 +114,7 @@ describe('AzureMonitor ResourcePicker utils', () => {
expect(
findRow(
rows,
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0'
)
).toEqual(rows[0]);
});
@@ -123,7 +123,7 @@ describe('AzureMonitor ResourcePicker utils', () => {
const rows: ResourceRowGroup = [
{
id: '',
uri: '/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/CLOUD-DATASOURCES/providers/microsoft.storage/storageaccounts/csb100320016c43d2d0',
uri: '/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/CLOUD-DATASOURCES/providers/microsoft.storage/storageaccounts/csb100320016c43d2d0',
name: '',
type: ResourceRowType.Resource,
typeLabel: '',
@@ -132,10 +132,35 @@ describe('AzureMonitor ResourcePicker utils', () => {
expect(
findRow(
rows,
'/subscriptions/44693801-6ee6-49de-9b2d-9106972f9572/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0'
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Storage/storageAccounts/csb100320016c43d2d0'
)
).toEqual(rows[0]);
});
it('should find a row matching the right subresource', () => {
const rows: ResourceRowGroup = [
{
id: '',
uri: '/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Sql/servers/foo',
name: '',
type: ResourceRowType.Resource,
typeLabel: '',
},
{
id: '',
uri: '/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Sql/servers/foo/databases/bar',
name: '',
type: ResourceRowType.Resource,
typeLabel: '',
},
];
expect(
findRow(
rows,
'/subscriptions/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/resourceGroups/cloud-datasources/providers/Microsoft.Sql/servers/foo/databases/bar'
)
).toEqual(rows[1]);
});
});
describe('setResource', () => {


@@ -62,6 +62,22 @@ export function isGUIDish(input: string) {
return !!input.match(/^[A-Z0-9]+/i);
}
function compareNamespaceAndName(
rowNamespace?: string,
rowName?: string,
resourceNamespace?: string,
resourceName?: string
) {
// StorageAccounts subresources are not listed independently
if (resourceNamespace?.startsWith('microsoft.storage/storageaccounts')) {
resourceNamespace = 'microsoft.storage/storageaccounts';
if (resourceName?.endsWith('/default')) {
resourceName = resourceName.slice(0, -'/default'.length);
}
}
return rowNamespace === resourceNamespace && rowName === resourceName;
}
function matchURI(rowURI: string, resourceURI: string) {
const targetParams = parseResourceDetails(resourceURI);
const rowParams = parseResourceDetails(rowURI);
@@ -69,11 +85,12 @@ function matchURI(rowURI: string, resourceURI: string) {
return (
rowParams?.subscription === targetParams?.subscription &&
rowParams?.resourceGroup?.toLowerCase() === targetParams?.resourceGroup?.toLowerCase() &&
// metricNamespace may include a subresource that we don't need to compare
rowParams?.metricNamespace?.toLowerCase().split('/')[0] ===
targetParams?.metricNamespace?.toLowerCase().split('/')[0] &&
// resourceName may include a subresource that we don't need to compare
rowParams?.resourceName?.split('/')[0] === targetParams?.resourceName?.split('/')[0]
compareNamespaceAndName(
rowParams?.metricNamespace?.toLowerCase(),
rowParams?.resourceName,
targetParams?.metricNamespace?.toLowerCase(),
targetParams?.resourceName
)
);
}


@@ -489,6 +489,38 @@ describe('VariableSupport', () => {
});
});
it('passes on the query error for a log query', (done) => {
const variableSupport = new VariableSupport(
createMockDatasource({
query: () =>
from(
Promise.resolve({
data: [],
error: {
message: 'boom',
},
})
),
})
);
const mockRequest = {
targets: [
{
queryType: AzureQueryType.LogAnalytics,
azureLogAnalytics: {
query: 'some log thing',
},
} as AzureMonitorQuery,
],
} as DataQueryRequest<AzureMonitorQuery>;
const observables = variableSupport.query(mockRequest);
observables.subscribe((result: DataQueryResponseData) => {
expect(result.data).toEqual([]);
expect(result.error.message).toEqual('boom');
done();
});
});
it('should handle http error', (done) => {
const error = invalidSubscriptionError();
const variableSupport = new VariableSupport(


@@ -94,6 +94,7 @@ export class VariableSupport extends CustomVariableSupport<DataSource, AzureMoni
const queryResp = await lastValueFrom(this.datasource.query(request));
return {
data: queryResp.data,
error: queryResp.error ? new Error(messageFromError(queryResp.error)) : undefined,
};
}
} catch (err) {


@@ -1,14 +1,9 @@
import { DataSourceInstanceSettings, ScopedVars } from '@grafana/data';
import { LanguageDefinition } from '@grafana/experimental';
import { TemplateSrv } from '@grafana/runtime';
import { AGGREGATE_FNS } from 'app/features/plugins/sql/constants';
import { SqlDatasource } from 'app/features/plugins/sql/datasource/SqlDatasource';
import {
DB,
LanguageCompletionProvider,
ResponseParser,
SQLQuery,
SQLSelectableValue,
} from 'app/features/plugins/sql/types';
import { DB, ResponseParser, SQLQuery, SQLSelectableValue } from 'app/features/plugins/sql/types';
import { formatSQL } from 'app/features/plugins/sql/utils/formatSQL';
import { getSchema, showDatabases, getSchemaAndName } from './MSSqlMetaQuery';
import { MSSqlQueryModel } from './MSSqlQueryModel';
@@ -18,7 +13,7 @@ import { getIcon, getRAQBType, toRawSql } from './sqlUtil';
import { MssqlOptions } from './types';
export class MssqlDatasource extends SqlDatasource {
completionProvider: LanguageCompletionProvider | undefined = undefined;
sqlLanguageDefinition: LanguageDefinition | undefined = undefined;
constructor(instanceSettings: DataSourceInstanceSettings<MssqlOptions>) {
super(instanceSettings);
}
@@ -59,16 +54,20 @@ export class MssqlDatasource extends SqlDatasource {
return result;
}
getSqlCompletionProvider(db: DB): LanguageCompletionProvider {
if (this.completionProvider !== undefined) {
return this.completionProvider;
getSqlLanguageDefinition(db: DB): LanguageDefinition {
if (this.sqlLanguageDefinition !== undefined) {
return this.sqlLanguageDefinition;
}
const args = {
getColumns: { current: (query: SQLQuery) => fetchColumns(db, query) },
getTables: { current: (dataset?: string) => fetchTables(db, dataset) },
};
this.completionProvider = getSqlCompletionProvider(args);
return this.completionProvider;
this.sqlLanguageDefinition = {
id: 'sql',
completionProvider: getSqlCompletionProvider(args),
formatter: formatSQL,
};
return this.sqlLanguageDefinition;
}
getDB(): DB {
@@ -79,7 +78,7 @@ export class MssqlDatasource extends SqlDatasource {
init: () => Promise.resolve(true),
datasets: () => this.fetchDatasets(),
tables: (dataset?: string) => this.fetchTables(dataset),
getSqlCompletionProvider: () => this.getSqlCompletionProvider(this.db),
getEditorLanguageDefinition: () => this.getSqlLanguageDefinition(this.db),
fields: async (query: SQLQuery) => {
if (!query?.dataset || !query?.table) {
return [];
@@ -94,7 +93,7 @@ export class MssqlDatasource extends SqlDatasource {
lookup: async (path?: string) => {
if (!path) {
const datasets = await this.fetchDatasets();
return datasets.map((d) => ({ name: d, completion: d }));
return datasets.map((d) => ({ name: d, completion: `${d}.` }));
} else {
const parts = path.split('.').filter((s: string) => s);
if (parts.length > 2) {
@@ -108,7 +107,6 @@ export class MssqlDatasource extends SqlDatasource {
}
}
},
functions: async () => AGGREGATE_FNS,
};
}
}


@@ -1,18 +1,13 @@
import { TableIdentifier } from '@grafana/experimental';
import { AGGREGATE_FNS, OPERATORS } from 'app/features/plugins/sql/constants';
import {
ColumnDefinition,
CompletionItemKind,
CompletionItemPriority,
DB,
getStandardSQLCompletionProvider,
LanguageCompletionProvider,
LinkedToken,
SQLQuery,
StatementPlacementProvider,
SuggestionKindProvider,
TableDefinition,
TableIdentifier,
TokenType,
} from 'app/features/plugins/sql/types';
} from '@grafana/experimental';
import { DB, SQLQuery } from 'app/features/plugins/sql/types';
interface CompletionProviderGetterArgs {
getColumns: React.MutableRefObject<(t: SQLQuery) => Promise<ColumnDefinition[]>>;
@@ -21,13 +16,17 @@ interface CompletionProviderGetterArgs {
export const getSqlCompletionProvider: (args: CompletionProviderGetterArgs) => LanguageCompletionProvider =
({ getColumns, getTables }) =>
() => ({
triggerCharacters: ['.', ' ', '$', ',', '(', "'"],
(monaco, language) => ({
...(language && getStandardSQLCompletionProvider(monaco, language)),
tables: {
resolve: async () => {
return await getTables.current();
resolve: async (identifier) => {
return await getTables.current(identifier.table);
},
parseName: (token: LinkedToken) => {
if (!token) {
return { table: '' };
}
let processedToken = token;
let tablePath = processedToken.value;
@@ -36,6 +35,10 @@ export const getSqlCompletionProvider: (args: CompletionProviderGetterArgs) => L
processedToken = processedToken.next;
}
if (processedToken.value.endsWith('.')) {
tablePath = processedToken.value.slice(0, processedToken.value.length - 1);
}
return { table: tablePath };
},
},
@@ -50,74 +53,8 @@ export const getSqlCompletionProvider: (args: CompletionProviderGetterArgs) => L
return await getColumns.current({ table: `${schema}.${tableName}`, dataset: database, refId: 'A' });
},
},
supportedFunctions: () => AGGREGATE_FNS,
supportedOperators: () => OPERATORS,
customSuggestionKinds: customSuggestionKinds(getTables, getColumns),
customStatementPlacement,
});
export enum CustomStatementPlacement {
AfterDatabase = 'afterDatabase',
}
export enum CustomSuggestionKind {
TablesWithinDatabase = 'tablesWithinDatabase',
}
export const customStatementPlacement: StatementPlacementProvider = () => [
{
id: CustomStatementPlacement.AfterDatabase,
resolve: (currentToken, previousKeyword) => {
return Boolean(
currentToken?.is(TokenType.Delimiter, '.') ||
(currentToken?.is(TokenType.Whitespace) && currentToken?.previous?.is(TokenType.Delimiter, '.')) ||
(currentToken?.isNumber() && currentToken.value.endsWith('.'))
);
},
},
];
export const customSuggestionKinds: (
getTables: CompletionProviderGetterArgs['getTables'],
getFields: CompletionProviderGetterArgs['getColumns']
) => SuggestionKindProvider = (getTables) => () =>
[
{
id: CustomSuggestionKind.TablesWithinDatabase,
applyTo: [CustomStatementPlacement.AfterDatabase],
suggestionsResolver: async (ctx) => {
const tablePath = ctx.currentToken ? getDatabaseName(ctx.currentToken) : '';
const t = await getTables.current(tablePath);
return t.map((table) => ({
label: table.name,
insertText: table.completion ?? table.name,
command: { id: 'editor.action.triggerSuggest', title: '' },
kind: CompletionItemKind.Field,
sortText: CompletionItemPriority.High,
range: {
...ctx.range,
startColumn: ctx.range.endColumn,
endColumn: ctx.range.endColumn,
},
}));
},
},
];
export function getDatabaseName(token: LinkedToken) {
let processedToken = token;
let database = '';
while (processedToken?.previous && !processedToken.previous.isWhiteSpace()) {
processedToken = processedToken.previous;
database = processedToken.value + database;
}
database = database.trim();
return database;
}
export async function fetchColumns(db: DB, q: SQLQuery) {
const cols = await db.fields(q);
if (cols.length > 0) {
@@ -130,6 +67,6 @@ export async function fetchColumns(db: DB, q: SQLQuery) {
}
export async function fetchTables(db: DB, dataset?: string) {
const tables = await db.lookup(dataset);
return tables;
const tables = await db.lookup?.(dataset);
return tables || [];
}
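The revised fetchTables guards against a missing lookup method with an optional call plus a fallback. In isolation the pattern looks like this (the `MaybeDB` shape is a simplified stand-in for illustration, not the real Grafana `DB` interface):

```typescript
// Simplified stand-in for the DB interface; `lookup` may be absent.
interface MaybeDB {
  lookup?: (dataset?: string) => Promise<string[]>;
}

// Mirrors the fetchTables change: `db.lookup?.()` short-circuits to
// undefined when the method is missing, and `|| []` normalizes the result
// so callers always receive an array.
async function fetchTablesSafe(db: MaybeDB, dataset?: string): Promise<string[]> {
  const tables = await db.lookup?.(dataset);
  return tables || [];
}
```

With a `db` that lacks `lookup`, the call resolves to an empty array instead of throwing a TypeError.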


@@ -1,29 +1,24 @@
import { DataSourceInstanceSettings, ScopedVars, TimeRange } from '@grafana/data';
import { CompletionItemKind, LanguageDefinition, TableIdentifier } from '@grafana/experimental';
import { TemplateSrv } from '@grafana/runtime';
import { SqlDatasource } from 'app/features/plugins/sql/datasource/SqlDatasource';
import {
CompletionItemKind,
DB,
LanguageCompletionProvider,
ResponseParser,
SQLQuery,
} from 'app/features/plugins/sql/types';
import { DB, ResponseParser, SQLQuery } from 'app/features/plugins/sql/types';
import { formatSQL } from 'app/features/plugins/sql/utils/formatSQL';
import MySQLQueryModel from './MySqlQueryModel';
import MySqlResponseParser from './MySqlResponseParser';
import { mapFieldsToTypes } from './fields';
import { buildColumnQuery, buildTableQuery, showDatabases } from './mySqlMetaQuery';
import { fetchColumns, fetchTables, getFunctions, getSqlCompletionProvider } from './sqlCompletionProvider';
import { getSqlCompletionProvider } from './sqlCompletionProvider';
import { MySQLOptions } from './types';
export class MySqlDatasource extends SqlDatasource {
responseParser: MySqlResponseParser;
completionProvider: LanguageCompletionProvider | undefined;
sqlLanguageDefinition: LanguageDefinition | undefined;
constructor(private instanceSettings: DataSourceInstanceSettings<MySQLOptions>) {
super(instanceSettings);
this.responseParser = new MySqlResponseParser();
this.completionProvider = undefined;
}
getQueryModel(target?: Partial<SQLQuery>, templateSrv?: TemplateSrv, scopedVars?: ScopedVars): MySQLQueryModel {
@@ -34,19 +29,20 @@ export class MySqlDatasource extends SqlDatasource {
return this.responseParser;
}
getSqlCompletionProvider(db: DB): LanguageCompletionProvider {
if (this.completionProvider !== undefined) {
return this.completionProvider;
getSqlLanguageDefinition(db: DB): LanguageDefinition {
if (this.sqlLanguageDefinition !== undefined) {
return this.sqlLanguageDefinition;
}
const args = {
getColumns: { current: (query: SQLQuery) => fetchColumns(db, query) },
getTables: { current: (dataset?: string) => fetchTables(db, { dataset }) },
fetchMeta: { current: (path?: string) => this.fetchMeta(path) },
getFunctions: { current: () => getFunctions() },
getMeta: { current: (identifier?: TableIdentifier) => this.fetchMeta(identifier) },
};
this.completionProvider = getSqlCompletionProvider(args);
return this.completionProvider;
this.sqlLanguageDefinition = {
id: 'sql',
completionProvider: getSqlCompletionProvider(args),
formatter: formatSQL,
};
return this.sqlLanguageDefinition;
}
async fetchDatasets(): Promise<string[]> {
@@ -69,28 +65,20 @@ export class MySqlDatasource extends SqlDatasource {
return mapFieldsToTypes(fields);
}
async fetchMeta(path?: string) {
async fetchMeta(identifier?: TableIdentifier) {
const defaultDB = this.instanceSettings.jsonData.database;
path = path?.trim();
if (!path && defaultDB) {
if (!identifier?.schema && defaultDB) {
const tables = await this.fetchTables(defaultDB);
return tables.map((t) => ({ name: t, completion: t, kind: CompletionItemKind.Class }));
} else if (!path) {
return tables.map((t) => ({ name: t, completion: `${defaultDB}.${t}`, kind: CompletionItemKind.Class }));
} else if (!identifier?.schema && !defaultDB) {
const datasets = await this.fetchDatasets();
return datasets.map((d) => ({ name: d, completion: `${d}.`, kind: CompletionItemKind.Module }));
} else {
const parts = path.split('.').filter((s: string) => s);
if (parts.length > 2) {
return [];
}
if (parts.length === 1 && !defaultDB) {
const tables = await this.fetchTables(parts[0]);
if (!identifier?.table && !defaultDB) {
const tables = await this.fetchTables(identifier?.schema);
return tables.map((t) => ({ name: t, completion: t, kind: CompletionItemKind.Class }));
} else if (parts.length === 1 && defaultDB) {
const fields = await this.fetchFields({ dataset: defaultDB, table: parts[0] });
return fields.map((t) => ({ name: t.value, completion: t.value, kind: CompletionItemKind.Field }));
} else if (parts.length === 2 && !defaultDB) {
const fields = await this.fetchFields({ dataset: parts[0], table: parts[1] });
} else if (identifier?.table && identifier.schema) {
const fields = await this.fetchFields({ dataset: identifier.schema, table: identifier.table });
return fields.map((t) => ({ name: t.value, completion: t.value, kind: CompletionItemKind.Field }));
} else {
return [];
@@ -109,9 +97,8 @@ export class MySqlDatasource extends SqlDatasource {
validateQuery: (query: SQLQuery, range?: TimeRange) =>
Promise.resolve({ query, error: '', isError: false, isValid: true }),
dsID: () => this.id,
lookup: (path?: string) => this.fetchMeta(path),
getSqlCompletionProvider: () => this.getSqlCompletionProvider(this.db),
functions: async () => getFunctions(),
functions: () => ['VARIANCE', 'STDDEV'],
getEditorLanguageDefinition: () => this.getSqlLanguageDefinition(this.db),
};
}
}
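The new getSqlLanguageDefinition getter builds the language definition once and returns the cached instance on later calls. The same cache-on-first-call pattern can be sketched in isolation (the types here are simplified placeholders, not the actual @grafana/experimental API):

```typescript
// Simplified placeholder for a language definition.
interface LanguageDefinition {
  id: string;
  formatter: (q: string) => string;
}

class MemoizedDefinitionHolder {
  private cached: LanguageDefinition | undefined;
  builds = 0; // counts how many times the definition is actually constructed

  getLanguageDefinition(): LanguageDefinition {
    if (this.cached !== undefined) {
      return this.cached; // subsequent calls reuse the first instance
    }
    this.builds++;
    this.cached = { id: 'sql', formatter: (q) => q.trim() };
    return this.cached;
  }
}
```

Returning the same object on every call means repeated editor mounts share one definition, so Monaco registration is not redone per render.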


@@ -1,20 +0,0 @@
export const FUNCTIONS = [
{
id: 'STDDEV',
name: 'STDDEV',
description: `STDDEV(
expression
)
Returns the standard deviation of non-NULL input values, or NaN if the input contains a NaN.`,
},
{
id: 'VARIANCE',
name: 'VARIANCE',
description: `VARIANCE(
expression
)
Returns the variance of non-NULL input values, or NaN if the input contains a NaN.`,
},
];


@@ -1,284 +1,22 @@
import { AGGREGATE_FNS, OPERATORS } from 'app/features/plugins/sql/constants';
import {
Aggregate,
ColumnDefinition,
CompletionItemKind,
CompletionItemPriority,
DB,
getStandardSQLCompletionProvider,
LanguageCompletionProvider,
LinkedToken,
MetaDefinition,
PositionContext,
SQLQuery,
StatementPlacementProvider,
StatementPosition,
SuggestionKindProvider,
TableDefinition,
TokenType,
} from 'app/features/plugins/sql/types';
import { FUNCTIONS } from './functions';
TableIdentifier,
} from '@grafana/experimental';
interface CompletionProviderGetterArgs {
getColumns: React.MutableRefObject<(t: SQLQuery) => Promise<ColumnDefinition[]>>;
getTables: React.MutableRefObject<(d?: string) => Promise<TableDefinition[]>>;
fetchMeta: React.MutableRefObject<(d?: string) => Promise<MetaDefinition[]>>;
getFunctions: React.MutableRefObject<(d?: string) => Aggregate[]>;
getMeta: React.MutableRefObject<(t?: TableIdentifier) => Promise<TableDefinition[]>>;
}
export const getSqlCompletionProvider: (args: CompletionProviderGetterArgs) => LanguageCompletionProvider =
({ getColumns, getTables, fetchMeta, getFunctions }) =>
() => ({
triggerCharacters: ['.', ' ', '$', ',', '(', "'"],
supportedFunctions: () => getFunctions.current(),
supportedOperators: () => OPERATORS,
customSuggestionKinds: customSuggestionKinds(getTables, getColumns, fetchMeta),
customStatementPlacement,
({ getMeta }) =>
(monaco, language) => ({
...(language && getStandardSQLCompletionProvider(monaco, language)),
tables: {
resolve: getMeta.current,
},
columns: {
resolve: getMeta.current,
},
});
export enum CustomStatementPlacement {
AfterDataset = 'afterDataset',
AfterFrom = 'afterFrom',
AfterSelect = 'afterSelect',
}
export enum CustomSuggestionKind {
TablesWithinDataset = 'tablesWithinDataset',
}
export enum Direction {
Next = 'next',
Previous = 'previous',
}
const TRIGGER_SUGGEST = 'editor.action.triggerSuggest';
enum Keyword {
Select = 'SELECT',
Where = 'WHERE',
From = 'FROM',
}
export const customStatementPlacement: StatementPlacementProvider = () => [
{
id: CustomStatementPlacement.AfterDataset,
resolve: (currentToken, previousKeyword) => {
return Boolean(
currentToken?.is(TokenType.Delimiter, '.') ||
(currentToken?.is(TokenType.Whitespace) && currentToken?.previous?.is(TokenType.Delimiter, '.'))
);
},
},
{
id: CustomStatementPlacement.AfterFrom,
resolve: (currentToken, previousKeyword) => {
return Boolean(isAfterFrom(currentToken));
},
},
{
id: CustomStatementPlacement.AfterSelect,
resolve: (token, previousKeyword) => {
const is =
isDirectlyAfter(token, Keyword.Select) ||
(isAfterSelect(token) && token?.previous?.is(TokenType.Delimiter, ','));
return Boolean(is);
},
},
];
export const customSuggestionKinds: (
getTables: CompletionProviderGetterArgs['getTables'],
getFields: CompletionProviderGetterArgs['getColumns'],
fetchMeta: CompletionProviderGetterArgs['fetchMeta']
) => SuggestionKindProvider = (getTables, _, fetchMeta) => () =>
[
{
id: CustomSuggestionKind.TablesWithinDataset,
applyTo: [CustomStatementPlacement.AfterDataset],
suggestionsResolver: async (ctx) => {
const tablePath = ctx.currentToken ? getTablePath(ctx.currentToken) : '';
const t = await getTables.current(tablePath);
return t.map((table) => suggestion(table.name, table.completion ?? table.name, CompletionItemKind.Field, ctx));
},
},
{
id: 'metaAfterSelect',
applyTo: [CustomStatementPlacement.AfterSelect],
suggestionsResolver: async (ctx) => {
const path = getPath(ctx.currentToken, Direction.Next);
const t = await fetchMeta.current(path);
return t.map((meta) => {
const completion = meta.kind === CompletionItemKind.Class ? `${meta.completion}.` : meta.completion;
return suggestion(meta.name, completion!, meta.kind, ctx);
});
},
},
{
id: 'metaAfterSelectFuncArg',
applyTo: [StatementPosition.AfterSelectFuncFirstArgument],
suggestionsResolver: async (ctx) => {
const path = getPath(ctx.currentToken, Direction.Next);
const t = await fetchMeta.current(path);
return t.map((meta) => {
const completion = meta.kind === CompletionItemKind.Class ? `${meta.completion}.` : meta.completion;
return suggestion(meta.name, completion!, meta.kind, ctx);
});
},
},
{
id: 'metaAfterFrom',
applyTo: [CustomStatementPlacement.AfterFrom],
suggestionsResolver: async (ctx) => {
// TODO: why is this triggering when isAfterFrom is false
if (!isAfterFrom(ctx.currentToken)) {
return [];
}
const path = ctx.currentToken?.value || '';
const t = await fetchMeta.current(path);
return t.map((meta) => suggestion(meta.name, meta.completion!, meta.kind, ctx));
},
},
{
id: `MYSQL${StatementPosition.WhereKeyword}`,
applyTo: [StatementPosition.WhereKeyword],
suggestionsResolver: async (ctx) => {
const path = getPath(ctx.currentToken, Direction.Previous);
const t = await fetchMeta.current(path);
return t.map((meta) => {
const completion = meta.kind === CompletionItemKind.Class ? `${meta.completion}.` : meta.completion;
return suggestion(meta.name, completion!, meta.kind, ctx);
});
},
},
{
id: StatementPosition.WhereComparisonOperator,
applyTo: [StatementPosition.WhereComparisonOperator],
suggestionsResolver: async (ctx) => {
if (!isAfterWhere(ctx.currentToken)) {
return [];
}
const path = getPath(ctx.currentToken, Direction.Previous);
const t = await fetchMeta.current(path);
const sugg = t.map((meta) => {
const completion = meta.kind === CompletionItemKind.Class ? `${meta.completion}.` : meta.completion;
return suggestion(meta.name, completion!, meta.kind, ctx);
});
return sugg;
},
},
];
function getPath(token: LinkedToken | null, direction: Direction) {
let path = token?.value || '';
const fromValue = keywordValue(token, Keyword.From, direction);
if (fromValue) {
path = fromValue;
}
return path;
}
export function getTablePath(token: LinkedToken) {
let processedToken = token;
let tablePath = '';
while (processedToken?.previous && !processedToken.previous.isWhiteSpace()) {
processedToken = processedToken.previous;
tablePath = processedToken.value + tablePath;
}
tablePath = tablePath.trim();
return tablePath;
}
function suggestion(label: string, completion: string, kind: CompletionItemKind, ctx: PositionContext) {
return {
label,
insertText: completion,
command: { id: TRIGGER_SUGGEST, title: '' },
kind,
sortText: CompletionItemPriority.High,
range: {
...ctx.range,
startColumn: ctx.range.endColumn,
endColumn: ctx.range.endColumn,
},
};
}
function isAfterSelect(token: LinkedToken | null) {
return isAfterKeyword(token, Keyword.Select);
}
function isAfterFrom(token: LinkedToken | null) {
return isDirectlyAfter(token, Keyword.From);
}
function isAfterWhere(token: LinkedToken | null) {
return isAfterKeyword(token, Keyword.Where);
}
function isAfterKeyword(token: LinkedToken | null, keyword: string) {
if (!token?.is(TokenType.Keyword)) {
let curToken = token;
while (true) {
if (!curToken) {
return false;
}
if (curToken.is(TokenType.Keyword, keyword)) {
return true;
}
if (curToken.isKeyword()) {
return false;
}
curToken = curToken?.previous || null;
}
}
return false;
}
function isDirectlyAfter(token: LinkedToken | null, keyword: string) {
return token?.is(TokenType.Whitespace) && token?.previous?.is(TokenType.Keyword, keyword);
}
function keywordValue(token: LinkedToken | null, keyword: Keyword, direction: Direction) {
let next = token;
while (next) {
if (next.is(TokenType.Keyword, keyword)) {
return tokenValue(next);
}
next = next[direction];
}
return false;
}
function tokenValue(token: LinkedToken | null): string | undefined {
const ws = token?.next;
if (ws?.isWhiteSpace()) {
const v = ws.next;
const delim = v?.next;
if (!delim?.is(TokenType.Delimiter)) {
return v?.value;
}
return `${v?.value}${delim?.value}${delim.next?.value}`;
}
return undefined;
}
export async function fetchColumns(db: DB, q: SQLQuery) {
const cols = await db.fields(q);
if (cols.length > 0) {
return cols.map((c) => {
return { name: c.value, type: c.value, description: c.value };
});
} else {
return [];
}
}
export async function fetchTables(db: DB, q: Partial<SQLQuery>) {
const tables = await db.lookup(q.dataset);
return tables;
}
export function getFunctions(): Aggregate[] {
return [...AGGREGATE_FNS, ...FUNCTIONS];
}


@@ -153,10 +153,16 @@ export const PostgresConfigEditor = (props: DataSourcePluginOptionsEditorProps<P
) : null}
</FieldSet>
{options.jsonData.sslmode !== 'disable' ? (
{jsonData.sslmode !== PostgresTLSModes.disable ? (
<FieldSet label="TLS/SSL Auth Details">
{options.jsonData.tlsConfigurationMethod === PostgresTLSMethods.fileContent ? (
<TLSSecretsConfig editorProps={props} labelWidth={labelWidthSSLDetails}></TLSSecretsConfig>
{jsonData.tlsConfigurationMethod === PostgresTLSMethods.fileContent ? (
<TLSSecretsConfig
showCACert={
jsonData.sslmode === PostgresTLSModes.verifyCA || jsonData.sslmode === PostgresTLSModes.verifyFull
}
editorProps={props}
labelWidth={labelWidthSSLDetails}
></TLSSecretsConfig>
) : (
<>
<InlineField


@@ -457,16 +457,21 @@ export function getConfig(opts: BarsOptions, theme: GrafanaTheme2) {
let isHovered = hRect && seriesIdx === hRect.sidx;
let heightReduce = 0;
let widthReduce = 0;
// get height of bar rect at same index of the series below the hovered one
if (isStacked && isHovered && hRect!.sidx > 1) {
heightReduce = findRect(qt, hRect!.sidx - 1, hRect!.didx)!.h;
if (isXHorizontal) {
heightReduce = findRect(qt, hRect!.sidx - 1, hRect!.didx)!.h;
} else {
widthReduce = findRect(qt, hRect!.sidx - 1, hRect!.didx)!.w;
}
}
return {
left: isHovered ? hRect!.x / devicePixelRatio : -10,
left: isHovered ? (hRect!.x + widthReduce) / devicePixelRatio : -10,
top: isHovered ? hRect!.y / devicePixelRatio : -10,
width: isHovered ? hRect!.w / devicePixelRatio : 0,
width: isHovered ? (hRect!.w - widthReduce) / devicePixelRatio : 0,
height: isHovered ? (hRect!.h - heightReduce) / devicePixelRatio : 0,
};
},
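For horizontally stacked bars, the fix shifts the hover overlay right by the width of the segment rendered below the hovered one and shrinks the overlay by the same amount, so only the hovered segment is highlighted. The geometry, using hypothetical rect values rather than real quadtree output, reduces to:

```typescript
// Hypothetical bar rects in canvas pixels: the hovered segment and the
// segment of the series below it at the same data index.
interface Rect {
  x: number;
  y: number;
  w: number;
  h: number;
}

// Mirrors the overlay arithmetic from getConfig for the horizontal-stack
// case: the overlay starts where the lower segment ends and covers only
// the remaining width, all scaled by devicePixelRatio.
function hoverOverlay(hovered: Rect, below: Rect, dpr: number) {
  const widthReduce = below.w;
  return {
    left: (hovered.x + widthReduce) / dpr,
    top: hovered.y / dpr,
    width: (hovered.w - widthReduce) / dpr,
    height: hovered.h / dpr,
  };
}
```

The vertical-stack case in the diff is the same idea with `h` and `top` in place of `w` and `left`.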


@@ -1,6 +1,6 @@
import React, { useMemo } from 'react';
import { Field, PanelProps, getLinksSupplier } from '@grafana/data';
import { Field, PanelProps } from '@grafana/data';
import { PanelDataErrorView } from '@grafana/runtime';
import { TooltipDisplayMode } from '@grafana/schema';
import { usePanelContext, TimeSeries, TooltipPlugin, ZoomPlugin, KeyboardPlugin } from '@grafana/ui';
@@ -14,7 +14,7 @@ import { ExemplarsPlugin } from './plugins/ExemplarsPlugin';
import { OutsideRangePlugin } from './plugins/OutsideRangePlugin';
import { ThresholdControlsPlugin } from './plugins/ThresholdControlsPlugin';
import { TimeSeriesOptions } from './types';
import { getTimezones, prepareGraphableFields } from './utils';
import { getTimezones, prepareGraphableFields, regenerateLinksSupplier } from './utils';
interface TimeSeriesPanelProps extends PanelProps<TimeSeriesOptions> {}
@@ -65,15 +65,11 @@ export const TimeSeriesPanel: React.FC<TimeSeriesPanelProps> = ({
options={options}
>
{(config, alignedDataFrame) => {
alignedDataFrame.fields.forEach((field) => {
field.getLinks = getLinksSupplier(
alignedDataFrame,
field,
field.state!.scopedVars!,
replaceVariables,
timeZone
);
});
if (
alignedDataFrame.fields.filter((f) => f.config.links !== undefined && f.config.links.length > 0).length > 0
) {
alignedDataFrame = regenerateLinksSupplier(alignedDataFrame, frames, replaceVariables, timeZone);
}
return (
<>


@@ -43,7 +43,7 @@ describe('prepare timeseries graph', () => {
const frames = prepareGraphableFields(input, createTheme());
const out = frames![0];
expect(out.fields.map((f) => f.name)).toEqual(['a', 'c', 'd']);
expect(out.fields.map((f) => f.name)).toEqual(['a', 'b', 'c', 'd']);
const field = out.fields.find((f) => f.name === 'c');
expect(field?.display).toBeDefined();


@@ -4,8 +4,11 @@ import {
Field,
FieldType,
getDisplayProcessor,
getLinksSupplier,
GrafanaTheme2,
InterpolateFunction,
isBooleanUnit,
SortedVector,
TimeRange,
} from '@grafana/data';
import { GraphFieldConfig, LineInterpolation } from '@grafana/schema';
@@ -60,6 +63,14 @@ export function prepareGraphableFields(
),
};
fields.push(copy);
break; // ok
case FieldType.string:
copy = {
...field,
values: new ArrayVector(field.values.toArray()),
};
fields.push(copy);
break; // ok
case FieldType.boolean:
@@ -123,3 +134,46 @@ export function getTimezones(timezones: string[] | undefined, defaultTimezone: s
}
return timezones.map((v) => (v?.length ? v : defaultTimezone));
}
export function regenerateLinksSupplier(
alignedDataFrame: DataFrame,
frames: DataFrame[],
replaceVariables: InterpolateFunction,
timeZone: string
): DataFrame {
alignedDataFrame.fields.forEach((field) => {
const frameIndex = field.state?.origin?.frameIndex;
if (frameIndex === undefined) {
return;
}
const frame = frames[frameIndex];
const tempFields: Field[] = [];
/* Check if the field has SortedVector values.
   If it does, sort all string fields in the original frame by the order array
   already used for the field; otherwise just attach the fields to the
   temporary frame used to get the links.
*/
for (const frameField of frame.fields) {
if (frameField.type === FieldType.string) {
if (field.values instanceof SortedVector) {
const copiedField = { ...frameField };
copiedField.values = new SortedVector(frameField.values, field.values.getOrderArray());
tempFields.push(copiedField);
} else {
tempFields.push(frameField);
}
}
}
const tempFrame: DataFrame = {
fields: [...alignedDataFrame.fields, ...tempFields],
length: alignedDataFrame.fields.length + tempFields.length,
};
field.getLinks = getLinksSupplier(tempFrame, field, field.state!.scopedVars!, replaceVariables, timeZone);
});
return alignedDataFrame;
}
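The SortedVector branch above re-applies the aligned field's sort order to string fields pulled from the original frame, so link interpolation sees rows in the same order as the aligned data. The reordering itself, with a minimal stand-in for @grafana/data's SortedVector (assumed behavior, not the real class), works like this:

```typescript
// Minimal stand-in for SortedVector: presents `values` through an index
// permutation without copying the underlying array.
class SortedVectorLike<T> {
  constructor(private values: T[], private order: number[]) {}

  // Reads position `index` of the sorted view.
  get(index: number): T {
    return this.values[this.order[index]];
  }

  // Exposes the permutation so it can be re-applied to sibling fields.
  getOrderArray(): number[] {
    return this.order;
  }

  toArray(): T[] {
    return this.order.map((i) => this.values[i]);
  }
}
```

Reordering `['b', 'a', 'c']` with the order array `[1, 0, 2]` yields the view `['a', 'b', 'c']`; applying the same order array to a string field keeps its rows aligned with the sorted numeric field.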


@@ -5325,9 +5325,9 @@ __metadata:
version: 0.0.0-use.local
resolution: "@grafana-plugins/input-datasource@workspace:plugins-bundled/internal/input-datasource"
dependencies:
"@grafana/data": 9.2.6
"@grafana/toolkit": 9.2.6
"@grafana/ui": 9.2.6
"@grafana/data": 9.2.7
"@grafana/toolkit": 9.2.7
"@grafana/ui": 9.2.7
"@types/jest": 26.0.15
"@types/lodash": 4.14.149
"@types/react": 17.0.30
@@ -5371,12 +5371,12 @@ __metadata:
languageName: node
linkType: hard
"@grafana/data@9.2.6, @grafana/data@workspace:*, @grafana/data@workspace:packages/grafana-data":
"@grafana/data@9.2.7, @grafana/data@workspace:*, @grafana/data@workspace:packages/grafana-data":
version: 0.0.0-use.local
resolution: "@grafana/data@workspace:packages/grafana-data"
dependencies:
"@braintree/sanitize-url": 6.0.0
"@grafana/schema": 9.2.6
"@grafana/schema": 9.2.7
"@grafana/tsconfig": ^1.2.0-rc1
"@rollup/plugin-commonjs": 22.0.1
"@rollup/plugin-json": 4.1.0
@@ -5435,7 +5435,7 @@ __metadata:
languageName: unknown
linkType: soft
"@grafana/e2e-selectors@9.2.6, @grafana/e2e-selectors@workspace:*, @grafana/e2e-selectors@workspace:packages/grafana-e2e-selectors":
"@grafana/e2e-selectors@9.2.7, @grafana/e2e-selectors@workspace:*, @grafana/e2e-selectors@workspace:packages/grafana-e2e-selectors":
version: 0.0.0-use.local
resolution: "@grafana/e2e-selectors@workspace:packages/grafana-e2e-selectors"
dependencies:
@@ -5461,7 +5461,7 @@ __metadata:
"@babel/core": 7.19.0
"@babel/preset-env": 7.19.0
"@cypress/webpack-preprocessor": 5.12.0
"@grafana/e2e-selectors": 9.2.6
"@grafana/e2e-selectors": 9.2.7
"@grafana/tsconfig": ^1.2.0-rc1
"@mochajs/json-file-reporter": ^1.2.0
"@rollup/plugin-node-resolve": 13.3.0
@@ -5548,15 +5548,15 @@ __metadata:
languageName: node
linkType: hard
"@grafana/runtime@9.2.6, @grafana/runtime@workspace:*, @grafana/runtime@workspace:packages/grafana-runtime":
"@grafana/runtime@9.2.7, @grafana/runtime@workspace:*, @grafana/runtime@workspace:packages/grafana-runtime":
version: 0.0.0-use.local
resolution: "@grafana/runtime@workspace:packages/grafana-runtime"
dependencies:
"@grafana/agent-web": ^0.4.0
"@grafana/data": 9.2.6
"@grafana/e2e-selectors": 9.2.6
"@grafana/data": 9.2.7
"@grafana/e2e-selectors": 9.2.7
"@grafana/tsconfig": ^1.2.0-rc1
"@grafana/ui": 9.2.6
"@grafana/ui": 9.2.7
"@rollup/plugin-commonjs": 22.0.1
"@rollup/plugin-node-resolve": 13.3.0
"@sentry/browser": 6.19.7
@@ -5592,7 +5592,7 @@ __metadata:
languageName: unknown
linkType: soft
"@grafana/schema@9.2.6, @grafana/schema@workspace:*, @grafana/schema@workspace:packages/grafana-schema":
"@grafana/schema@9.2.7, @grafana/schema@workspace:*, @grafana/schema@workspace:packages/grafana-schema":
version: 0.0.0-use.local
resolution: "@grafana/schema@workspace:packages/grafana-schema"
dependencies:
@@ -5612,7 +5612,7 @@ __metadata:
languageName: unknown
linkType: soft
"@grafana/toolkit@9.2.6, @grafana/toolkit@workspace:*, @grafana/toolkit@workspace:packages/grafana-toolkit":
"@grafana/toolkit@9.2.7, @grafana/toolkit@workspace:*, @grafana/toolkit@workspace:packages/grafana-toolkit":
version: 0.0.0-use.local
resolution: "@grafana/toolkit@workspace:packages/grafana-toolkit"
dependencies:
@@ -5628,10 +5628,10 @@ __metadata:
"@babel/preset-env": 7.18.9
"@babel/preset-react": 7.18.6
"@babel/preset-typescript": 7.18.6
"@grafana/data": 9.2.6
"@grafana/data": 9.2.7
"@grafana/eslint-config": 5.0.0
"@grafana/tsconfig": ^1.2.0-rc1
"@grafana/ui": 9.2.6
"@grafana/ui": 9.2.7
"@jest/core": 27.5.1
"@types/command-exists": ^1.2.0
"@types/eslint": 8.4.1
@@ -5712,16 +5712,16 @@ __metadata:
languageName: node
linkType: hard
"@grafana/ui@9.2.6, @grafana/ui@workspace:*, @grafana/ui@workspace:packages/grafana-ui":
"@grafana/ui@9.2.7, @grafana/ui@workspace:*, @grafana/ui@workspace:packages/grafana-ui":
version: 0.0.0-use.local
resolution: "@grafana/ui@workspace:packages/grafana-ui"
dependencies:
"@babel/core": 7.19.0
"@emotion/css": 11.9.0
"@emotion/react": 11.9.3
"@grafana/data": 9.2.6
"@grafana/e2e-selectors": 9.2.6
"@grafana/schema": 9.2.6
"@grafana/data": 9.2.7
"@grafana/e2e-selectors": 9.2.7
"@grafana/schema": 9.2.7
"@grafana/tsconfig": ^1.2.0-rc1
"@mdx-js/react": 1.6.22
"@monaco-editor/react": 4.4.5
@@ -6004,11 +6004,11 @@ __metadata:
resolution: "@jaegertracing/jaeger-ui-components@workspace:packages/jaeger-ui-components"
dependencies:
"@emotion/css": 11.9.0
"@grafana/data": 9.2.6
"@grafana/e2e-selectors": 9.2.6
"@grafana/runtime": 9.2.6
"@grafana/data": 9.2.7
"@grafana/e2e-selectors": 9.2.7
"@grafana/runtime": 9.2.7
"@grafana/tsconfig": ^1.2.0-rc1
"@grafana/ui": 9.2.6
"@grafana/ui": 9.2.7
"@testing-library/jest-dom": 5.16.4
"@testing-library/react": 12.1.4
"@testing-library/user-event": 14.4.3