Compare commits


27 Commits

Author SHA1 Message Date
Larissa Wandzura
e206ef01af cleaned up the table a bit 2025-12-03 14:11:06 -06:00
Larissa Wandzura
06bbf043c0 minor change 2025-11-21 15:37:53 -06:00
Larissa Wandzura
079100b081 ran prettier 2025-11-20 11:19:15 -06:00
Larissa Wandzura
8e11703e94 more updates based on feedback 2025-11-19 17:29:47 -06:00
Larissa Wandzura
03e9ffcc91 Updates based on David's feedback 2025-11-19 16:54:36 -06:00
Larissa Wandzura
d7c75b8343 Update docs/sources/datasources/concepts.md
Co-authored-by: David Harris <david.harris@grafana.com>
2025-11-19 14:22:08 -06:00
Larissa Wandzura
9ddd8a02c0 Update docs/sources/datasources/concepts.md
Co-authored-by: David Harris <david.harris@grafana.com>
2025-11-19 14:21:35 -06:00
Larissa Wandzura
1601995fda Update docs/sources/datasources/concepts.md
Co-authored-by: David Harris <david.harris@grafana.com>
2025-11-19 14:21:23 -06:00
Larissa Wandzura
af87c3d6f3 update based on feedback 2025-11-14 15:14:38 -06:00
Larissa Wandzura
9f9b82f5cf clarified plugin install answer 2025-11-14 14:59:12 -06:00
Larissa Wandzura
04a9888a96 removed Grafana version to avoid confusing users 2025-11-13 16:17:32 -06:00
Larissa Wandzura
07a758d84a updates based on feedback on draft 2025-11-13 16:12:35 -06:00
Larissa Wandzura
111af8b1a8 Update docs/sources/datasources/concepts.md
Co-authored-by: Anna Urbiztondo <anna.urbiztondo@grafana.com>
2025-11-12 13:20:22 -06:00
Larissa Wandzura
4c97e49fc5 cleaned up spelling and punctuation 2025-11-07 16:10:32 -06:00
Larissa Wandzura
10a291ec8b added new concepts doc 2025-11-07 16:04:10 -06:00
beejeebus
0e9fe9dc40 Register external datasource plugins on startup
The current code only registers core datasource Kubernetes API groups.

Add external plugins.

Companion grafana-enterprise PR:

https://github.com/grafana/grafana-enterprise/pull/10125
2025-11-07 14:42:41 -05:00
Paul Marbach
90ddd922ad Chore: Cleanup panelMonitoring feature flag (#113530) 2025-11-07 14:04:42 -05:00
Moustafa Baiou
1e1adafeec Alerting: Add admission hooks for rules app (#113429)
This adds validating admission hooks to enforce the requirements on AlertRules and RecordingRules that are currently enforced through the provisioning service and storage mechanisms, in preparation for consistent validation in both legacy storage and unified storage. It also adds a mutating admission hook to the app that keeps folder annotations and folder labels in sync, so that label-selector lists can be performed.
2025-11-07 12:01:16 -05:00
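The folder-sync mutation described in the commit above can be sketched in isolation. This is a minimal, illustrative stand-in (not the actual hook, which operates on `*v1.AlertRule` via the app SDK); the annotation key mirrors the `grafana.app/folder` constant added in this PR.

```go
package main

// folderKey mirrors the FolderAnnotationKey constant introduced in this PR.
const folderKey = "grafana.app/folder"

// syncFolderLabel copies the folder UID from an object's annotations into its
// labels, so that label-selector lists can match on folder. Returns the
// (possibly newly allocated) labels map.
func syncFolderLabel(annotations, labels map[string]string) map[string]string {
	folderUID := annotations[folderKey] // reading a nil map yields ""
	if folderUID == "" {
		return labels // no folder annotation; nothing to sync
	}
	if labels == nil {
		labels = make(map[string]string)
	}
	labels[folderKey] = folderUID
	return labels
}
```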
Paul Marbach
ecc9e9257e E2E: Prevent issue where certain times can cause test failures (#110196)
* E2E: Prevent issue where certain times can cause test failures

* re-enable first test
2025-11-07 11:34:11 -05:00
Paul Marbach
4fee8b34ad Suggestions: Refactor getPanelDataSummary into its own method (#113251)
* Suggestions: Refactor getPanelDataSummary into its own method

* restore order

* update some imports

* update codeowners
2025-11-07 11:33:13 -05:00
Roberto Jiménez Sánchez
02464c19b8 Provisioning: Add validation for Job specifications (#113590)
* Validate Job Specs

* Add comprehensive unit test coverage for job validator

- Added 8 new test cases to improve coverage from 88.9% to ~100%
- Tests for migrate action without options
- Tests for delete/move actions with resources (missing kind)
- Tests for move action with valid resources
- Tests for move/delete with both paths and resources
- Tests for move action with invalid source paths
- Tests for push action with valid paths

Now covers all validation paths including resource validation and
edge cases for all job action types.

* Add integration tests for job validation

Added comprehensive integration tests that verify the job validator properly
rejects invalid job specifications via the API:

- Test job without action (required field)
- Test job with invalid action
- Test pull job without pull options
- Test push job without push options
- Test push job with invalid branch name (consecutive dots)
- Test push job with path traversal attempt
- Test delete job without paths or resources
- Test delete job with invalid path (path traversal)
- Test move job without target path
- Test move job without paths or resources
- Test move job with invalid target path (path traversal)
- Test migrate job without migrate options
- Test valid pull job to ensure validation doesn't block legitimate requests

These tests verify that the admission controller properly validates job specs
before they are persisted, ensuring security (path traversal prevention) and
data integrity (required fields/options).

* Remove valid job test case from integration tests

Removed the positive test case as it's not necessary for validation testing.
The integration tests now focus solely on verifying that invalid job specs
are properly rejected by the admission controller.

* Fix movejob_test to expect validation error at creation time

Updated the 'move without target path' test to expect the job creation
to fail with a validation error, rather than expecting the job to be
created and then fail during execution.

This aligns with the new job validation logic which rejects invalid
job specs at the API admission control level (422 Unprocessable Entity)
before they can be persisted.

This is better behavior as it prevents invalid jobs from being created
in the first place, rather than allowing them to be created and then
failing during execution.

* Simplify action validation using slices.Contains

Replaced manual loop with slices.Contains for cleaner, more idiomatic Go code.
This reduces code complexity while maintaining the same validation logic.

- Added import for 'slices' package
- Replaced 8-line loop with 1-line slices.Contains call
- All unit tests pass

* Refactor job action validation in ValidateJob function

Removed the hardcoded valid actions array and simplified the validation logic. The function now directly appends an error for invalid actions, improving code clarity and maintainability. This change aligns with the recent updates to job validation, ensuring that invalid job specifications are properly handled.
2025-11-07 16:31:50 +00:00
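The `slices.Contains` simplification mentioned in the commit above can be sketched as follows. The action names come from the commit message's test list; the function shape is illustrative, not the actual `ValidateJob` signature.

```go
package main

import "slices"

// validActions lists the job actions exercised by the tests in this PR
// (taken from the commit message, not from the real validator).
var validActions = []string{"pull", "push", "migrate", "delete", "move"}

// isValidAction replaces a hand-written loop with slices.Contains,
// the one-line idiom the commit describes.
func isValidAction(action string) bool {
	return slices.Contains(validActions, action)
}
```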
Sven Grossmann
62129bb91f Search: Change copy to Search with Grafana Assistant (#113609) 2025-11-07 16:27:19 +00:00
Paul Marbach
3d8da61569 E2E: Improve ad-hoc filtering test (#113558)
* E2E: Improve ad-hoc filtering test

* remove unused import

* fix some table e2es after making getCell sync
2025-11-07 11:06:33 -05:00
Misi
d7d296df8e Fix: Return auth labels from /api/users/lookup (#113584)
* wip

* Return auth labels from /api/users/lookup

* Rename

* Address feedback

* Add more tests, fix tests

* Cleanup
2025-11-07 16:51:41 +01:00
Jean-Philippe Quéméner
305ed25896 fix(folders): add a circuit breaker to prevent infinite loops (#113596) 2025-11-07 14:32:17 +00:00
Yunwen Zheng
8b6cc211e9 Git Sync: Allow users to disable push to the configured branch (#113564)
* Git Sync: Allow users to disable push to the configured branch
2025-11-07 09:24:34 -05:00
Jean-Philippe Quéméner
1ca95cda4a fix(folders): prevent circular dependencies (#113595) 2025-11-07 14:19:55 +00:00
79 changed files with 3204 additions and 1152 deletions

.github/CODEOWNERS

@@ -254,7 +254,6 @@
/devenv/dev-dashboards/all-panels.json @grafana/dataviz-squad
/devenv/dev-dashboards/dashboards.go @grafana/dataviz-squad
/devenv/dev-dashboards/home.json @grafana/dataviz-squad
/devenv/dev-dashboards/datasource-elasticsearch/ @grafana/partner-datasources
/devenv/dev-dashboards/datasource-opentsdb/ @grafana/partner-datasources
/devenv/dev-dashboards/datasource-influxdb/ @grafana/partner-datasources
@@ -550,6 +549,7 @@ i18next.config.ts @grafana/grafana-frontend-platform
/packages/grafana-data/src/geo/ @grafana/dataviz-squad
/packages/grafana-data/src/monaco/ @grafana/partner-datasources
/packages/grafana-data/src/panel/ @grafana/dashboards-squad
/packages/grafana-data/src/panel/suggestions/ @grafana/dataviz-squad
/packages/grafana-data/src/query/ @grafana/grafana-datasources-core-services
/packages/grafana-data/src/rbac/ @grafana/access-squad
/packages/grafana-data/src/table/ @grafana/dataviz-squad


@@ -8,7 +8,16 @@ spec:
preferredVersion: v0alpha1
versions:
- kinds:
- conversion: false
- admission:
mutation:
operations:
- CREATE
- UPDATE
validation:
operations:
- CREATE
- UPDATE
conversion: false
kind: AlertRule
plural: AlertRules
schemas:
@@ -214,7 +223,16 @@ spec:
- spec.panelRef.dashboardUID
- spec.panelRef.panelID
- spec.notificationSettings.receiver
- conversion: false
- admission:
mutation:
operations:
- CREATE
- UPDATE
validation:
operations:
- CREATE
- UPDATE
conversion: false
kind: RecordingRule
plural: RecordingRules
schemas:


@@ -5,6 +5,7 @@ go 1.25.3
require (
github.com/grafana/grafana-app-sdk v0.48.1
github.com/grafana/grafana-app-sdk/logging v0.48.1
github.com/prometheus/common v0.67.1
k8s.io/apimachinery v0.34.1
k8s.io/kube-openapi v0.0.0-20250910181357-589584f1c912
)
@@ -49,7 +50,6 @@ require (
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect
github.com/prometheus/client_golang v1.23.2 // indirect
github.com/prometheus/client_model v0.6.2 // indirect
github.com/prometheus/common v0.67.1 // indirect
github.com/prometheus/procfs v0.16.1 // indirect
github.com/puzpuzpuz/xsync/v2 v2.5.1 // indirect
github.com/rogpeppe/go-internal v1.14.1 // indirect


@@ -13,6 +13,18 @@ alertRulev0alpha1: alertRuleKind & {
schema: {
spec: v0alpha1.AlertRuleSpec
}
validation: {
operations: [
"CREATE",
"UPDATE",
]
}
mutation: {
operations: [
"CREATE",
"UPDATE",
]
}
selectableFields: [
"spec.title",
"spec.paused",


@@ -13,6 +13,18 @@ recordingRulev0alpha1: recordingRuleKind & {
schema: {
spec: v0alpha1.RecordingRuleSpec
}
validation: {
operations: [
"CREATE",
"UPDATE",
]
}
mutation: {
operations: [
"CREATE",
"UPDATE",
]
}
selectableFields: [
"spec.title",
"spec.paused",


@@ -3,6 +3,7 @@ package v0alpha1
import (
"fmt"
"slices"
"time"
)
func (o *AlertRule) GetProvenanceStatus() string {
@@ -48,4 +49,78 @@ func (s *AlertRuleSpec) ExecErrStateOrDefault() string {
return s.ExecErrState
}
// TODO: add duration clamping for the field types AlertRulePromDuration, AlertRulePromDurationWMillis, and the For and KeepFiringFor string pointers
func (d *AlertRulePromDuration) ToDuration() (time.Duration, error) {
return ToDuration(string(*d))
}
func (d *AlertRulePromDurationWMillis) ToDuration() (time.Duration, error) {
return ToDuration(string(*d))
}
func (d *AlertRulePromDuration) Clamp() error {
clampedDuration, err := ClampDuration(string(*d))
if err != nil {
return err
}
*d = AlertRulePromDuration(clampedDuration)
return nil
}
func (d *AlertRulePromDurationWMillis) Clamp() error {
clampedDuration, err := ClampDuration(string(*d))
if err != nil {
return err
}
*d = AlertRulePromDurationWMillis(clampedDuration)
return nil
}
func (spec *AlertRuleSpec) ClampDurations() error {
// clamp all duration fields
if err := spec.Trigger.Interval.Clamp(); err != nil {
return err
}
if spec.For != nil {
clamped, err := ClampDuration(*spec.For)
if err != nil {
return err
}
spec.For = &clamped
}
if spec.KeepFiringFor != nil {
clamped, err := ClampDuration(*spec.KeepFiringFor)
if err != nil {
return err
}
spec.KeepFiringFor = &clamped
}
if spec.NotificationSettings != nil {
if spec.NotificationSettings.GroupWait != nil {
if err := spec.NotificationSettings.GroupWait.Clamp(); err != nil {
return err
}
}
if spec.NotificationSettings.GroupInterval != nil {
if err := spec.NotificationSettings.GroupInterval.Clamp(); err != nil {
return err
}
}
if spec.NotificationSettings.RepeatInterval != nil {
if err := spec.NotificationSettings.RepeatInterval.Clamp(); err != nil {
return err
}
}
}
for k, expr := range spec.Expressions {
if expr.RelativeTimeRange != nil {
if err := expr.RelativeTimeRange.From.Clamp(); err != nil {
return err
}
if err := expr.RelativeTimeRange.To.Clamp(); err != nil {
return err
}
spec.Expressions[k] = expr
}
}
return nil
}


@@ -1,10 +1,22 @@
package v0alpha1
import (
"fmt"
"time"
prom_model "github.com/prometheus/common/model"
)
const (
InternalPrefix = "grafana.com/"
GroupLabelKey = InternalPrefix + "group"
GroupIndexLabelKey = GroupLabelKey + "-index"
ProvenanceStatusAnnotationKey = InternalPrefix + "provenance"
// Copy of the max title length used in legacy validation path
AlertRuleMaxTitleLength = 190
// Annotation key used to store the folder UID on resources
FolderAnnotationKey = "grafana.app/folder"
FolderLabelKey = FolderAnnotationKey
)
const (
@@ -15,3 +27,20 @@ const (
var (
AcceptedProvenanceStatuses = []string{ProvenanceStatusNone, ProvenanceStatusAPI}
)
func ToDuration(s string) (time.Duration, error) {
promDuration, err := prom_model.ParseDuration(s)
if err != nil {
return 0, fmt.Errorf("invalid duration format: %w", err)
}
return time.Duration(promDuration), nil
}
// Convert the string duration to the longest valid Prometheus duration format (e.g., "60s" -> "1m")
func ClampDuration(s string) (string, error) {
promDuration, err := prom_model.ParseDuration(s)
if err != nil {
return "", fmt.Errorf("invalid duration format: %w", err)
}
return promDuration.String(), nil
}
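The `ClampDuration` helper above normalizes a Prometheus-style duration string to its canonical form (e.g. "60s" becomes "1m") via `prometheus/common`'s `model.ParseDuration`. As a dependency-free stand-in, this sketch shows the same parse-and-reformat round trip with the standard library; note that Go's own formatting differs from Prometheus's ("60s" renders as "1m0s").

```go
package main

import "time"

// normalizeDuration parses a duration string and re-renders it in canonical
// form. It mirrors the shape of ClampDuration in this PR, but uses
// time.ParseDuration instead of the Prometheus parser, so the accepted
// syntax and output format differ slightly.
func normalizeDuration(s string) (string, error) {
	d, err := time.ParseDuration(s)
	if err != nil {
		return "", err
	}
	return d.String(), nil
}
```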


@@ -3,6 +3,7 @@ package v0alpha1
import (
"fmt"
"slices"
"time"
)
func (o *RecordingRule) GetProvenanceStatus() string {
@@ -27,4 +28,47 @@ func (o *RecordingRule) SetProvenanceStatus(status string) (err error) {
return
}
// TODO: add duration clamping for the field types RecordingRulePromDurationWMillis and RecordingRulePromDuration
func (d *RecordingRulePromDuration) ToDuration() (time.Duration, error) {
return ToDuration(string(*d))
}
func (d *RecordingRulePromDurationWMillis) ToDuration() (time.Duration, error) {
return ToDuration(string(*d))
}
func (d *RecordingRulePromDuration) Clamp() error {
clampedDuration, err := ClampDuration(string(*d))
if err != nil {
return err
}
*d = RecordingRulePromDuration(clampedDuration)
return nil
}
func (d *RecordingRulePromDurationWMillis) Clamp() error {
clampedDuration, err := ClampDuration(string(*d))
if err != nil {
return err
}
*d = RecordingRulePromDurationWMillis(clampedDuration)
return nil
}
func (spec *RecordingRuleSpec) ClampDurations() error {
// clamp all duration fields
if err := spec.Trigger.Interval.Clamp(); err != nil {
return err
}
for k, expr := range spec.Expressions {
if expr.RelativeTimeRange != nil {
if err := expr.RelativeTimeRange.From.Clamp(); err != nil {
return err
}
if err := expr.RelativeTimeRange.To.Clamp(); err != nil {
return err
}
spec.Expressions[k] = expr
}
}
return nil
}


@@ -42,7 +42,21 @@ var appManifestData = app.ManifestData{
Plural: "AlertRules",
Scope: "Namespaced",
Conversion: false,
Schema: &versionSchemaAlertRulev0alpha1,
Admission: &app.AdmissionCapabilities{
Validation: &app.ValidationCapability{
Operations: []app.AdmissionOperation{
app.AdmissionOperationCreate,
app.AdmissionOperationUpdate,
},
},
Mutation: &app.MutationCapability{
Operations: []app.AdmissionOperation{
app.AdmissionOperationCreate,
app.AdmissionOperationUpdate,
},
},
},
Schema: &versionSchemaAlertRulev0alpha1,
SelectableFields: []string{
"spec.title",
"spec.paused",
@@ -57,7 +71,21 @@ var appManifestData = app.ManifestData{
Plural: "RecordingRules",
Scope: "Namespaced",
Conversion: false,
Schema: &versionSchemaRecordingRulev0alpha1,
Admission: &app.AdmissionCapabilities{
Validation: &app.ValidationCapability{
Operations: []app.AdmissionOperation{
app.AdmissionOperationCreate,
app.AdmissionOperationUpdate,
},
},
Mutation: &app.MutationCapability{
Operations: []app.AdmissionOperation{
app.AdmissionOperationCreate,
app.AdmissionOperationUpdate,
},
},
},
Schema: &versionSchemaRecordingRulev0alpha1,
SelectableFields: []string{
"spec.title",
"spec.paused",


@@ -0,0 +1,45 @@
package alertrule
import (
"context"
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/simple"
v1 "github.com/grafana/grafana/apps/alerting/rules/pkg/apis/alerting/v0alpha1"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
)
func NewMutator(cfg config.RuntimeConfig) *simple.Mutator {
return &simple.Mutator{
MutateFunc: func(ctx context.Context, req *app.AdmissionRequest) (*app.MutatingResponse, error) {
// Mutate folder label to match folder UID from annotation
r, ok := req.Object.(*v1.AlertRule)
if !ok || r == nil {
// Nothing to do or wrong type; no mutation
return nil, nil
}
// Read folder UID from annotation
folderUID := ""
if r.Annotations != nil {
folderUID = r.Annotations[v1.FolderAnnotationKey]
}
// Ensure labels map exists and set the folder label if folderUID is present
if folderUID != "" {
if r.Labels == nil {
r.Labels = make(map[string]string)
}
// Maintain folder metadata label for downstream systems (alertmanager grouping etc.)
r.Labels[v1.FolderLabelKey] = folderUID
}
// clamp all duration fields
if err := r.Spec.ClampDurations(); err != nil {
return nil, err
}
return &app.MutatingResponse{UpdatedObject: r}, nil
},
}
}


@@ -0,0 +1,123 @@
package alertrule
import (
"context"
"fmt"
"slices"
"strconv"
"time"
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/resource"
"github.com/grafana/grafana-app-sdk/simple"
model "github.com/grafana/grafana/apps/alerting/rules/pkg/apis/alerting/v0alpha1"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/util"
prom_model "github.com/prometheus/common/model"
)
func NewValidator(cfg config.RuntimeConfig) *simple.Validator {
return &simple.Validator{
ValidateFunc: func(ctx context.Context, req *app.AdmissionRequest) error {
// Cast to specific type
r, ok := req.Object.(*model.AlertRule)
if !ok {
return fmt.Errorf("object is not of type *v0alpha1.AlertRule")
}
// 1) Validate provenance status annotation
sourceProv := r.GetProvenanceStatus()
if !slices.Contains(model.AcceptedProvenanceStatuses, sourceProv) {
return fmt.Errorf("invalid provenance status: %s", sourceProv)
}
// 2) Validate group labels rules
group := r.Labels[model.GroupLabelKey]
groupIndexStr := r.Labels[model.GroupIndexLabelKey]
if req.Action == resource.AdmissionActionCreate {
if group != "" || groupIndexStr != "" {
return fmt.Errorf("cannot set group when creating alert rule")
}
}
if group != "" { // if group is set, group-index must be set and numeric
if groupIndexStr == "" {
return fmt.Errorf("%s must be set when %s is set", model.GroupIndexLabelKey, model.GroupLabelKey)
}
if _, err := strconv.Atoi(groupIndexStr); err != nil {
return fmt.Errorf("invalid %s: %w", model.GroupIndexLabelKey, err)
}
}
// 3) Validate folder is set and exists
// Read folder UID directly from annotations
folderUID := ""
if r.Annotations != nil {
folderUID = r.Annotations[model.FolderAnnotationKey]
}
if folderUID == "" {
return fmt.Errorf("folder is required")
}
if cfg.FolderValidator != nil {
ok, verr := cfg.FolderValidator(ctx, folderUID)
if verr != nil {
return fmt.Errorf("failed to validate folder: %w", verr)
}
if !ok {
return fmt.Errorf("folder does not exist: %s", folderUID)
}
}
// 4) Validate notification settings receiver if provided
if r.Spec.NotificationSettings != nil && r.Spec.NotificationSettings.Receiver != "" && cfg.NotificationSettingsValidator != nil {
ok, nerr := cfg.NotificationSettingsValidator(ctx, r.Spec.NotificationSettings.Receiver)
if nerr != nil {
return fmt.Errorf("failed to validate notification settings: %w", nerr)
}
if !ok {
return fmt.Errorf("invalid notification receiver: %s", r.Spec.NotificationSettings.Receiver)
}
}
// 5) Enforce max title length
if len(r.Spec.Title) > model.AlertRuleMaxTitleLength {
return fmt.Errorf("alert rule title is too long. Max length is %d", model.AlertRuleMaxTitleLength)
}
// 6) Validate evaluation interval against base interval
if err := util.ValidateInterval(cfg.BaseEvaluationInterval, &r.Spec.Trigger.Interval); err != nil {
return err
}
// 7) Disallow reserved/spec system label keys
if r.Spec.Labels != nil {
for key := range r.Spec.Labels {
if _, bad := cfg.ReservedLabelKeys[key]; bad {
return fmt.Errorf("label key is reserved and cannot be specified: %s", key)
}
}
}
// 8) For and KeepFiringFor must be >= 0 if set
if r.Spec.For != nil {
d, err := prom_model.ParseDuration(*r.Spec.For)
if err != nil {
return fmt.Errorf("invalid 'for' duration: %w", err)
}
if time.Duration(d) < 0 {
return fmt.Errorf("'for' cannot be less than 0")
}
}
if r.Spec.KeepFiringFor != nil {
d, err := prom_model.ParseDuration(*r.Spec.KeepFiringFor)
if err != nil {
return fmt.Errorf("invalid 'keepFiringFor' duration: %w", err)
}
if time.Duration(d) < 0 {
return fmt.Errorf("'keepFiringFor' cannot be less than 0")
}
}
return nil
},
}
}


@@ -6,16 +6,29 @@ import (
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/logging"
"github.com/grafana/grafana-app-sdk/operator"
"github.com/grafana/grafana-app-sdk/resource"
"github.com/grafana/grafana-app-sdk/simple"
"github.com/grafana/grafana/apps/alerting/rules/pkg/apis"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/alertrule"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/recordingrule"
)
func New(cfg app.Config) (app.App, error) {
managedKinds := make([]simple.AppManagedKind, 0)
runtimeCfg, ok := cfg.SpecificConfig.(config.RuntimeConfig)
if !ok {
return nil, config.ErrInvalidRuntimeConfig
}
for _, kinds := range apis.GetKinds() {
for _, kind := range kinds {
managedKinds = append(managedKinds, simple.AppManagedKind{Kind: kind})
managedKind := simple.AppManagedKind{
Kind: kind,
Validator: buildKindValidator(kind, runtimeCfg),
Mutator: buildKindMutator(kind, runtimeCfg),
}
managedKinds = append(managedKinds, managedKind)
}
}
@@ -44,3 +57,23 @@ func New(cfg app.Config) (app.App, error) {
return a, nil
}
func buildKindValidator(kind resource.Kind, cfg config.RuntimeConfig) *simple.Validator {
switch kind.Kind() {
case "AlertRule":
return alertrule.NewValidator(cfg)
case "RecordingRule":
return recordingrule.NewValidator(cfg)
}
return nil
}
func buildKindMutator(kind resource.Kind, cfg config.RuntimeConfig) *simple.Mutator {
switch kind.Kind() {
case "AlertRule":
return alertrule.NewMutator(cfg)
case "RecordingRule":
return recordingrule.NewMutator(cfg)
}
return nil
}


@@ -0,0 +1,175 @@
package app_test
import (
"context"
"testing"
"time"
appsdk "github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/resource"
v1 "github.com/grafana/grafana/apps/alerting/rules/pkg/apis/alerting/v0alpha1"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/alertrule"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/recordingrule"
)
func makeDefaultRuntimeConfig() config.RuntimeConfig {
return config.RuntimeConfig{
FolderValidator: func(ctx context.Context, folderUID string) (bool, error) { return folderUID == "f1", nil },
BaseEvaluationInterval: 60 * time.Second,
ReservedLabelKeys: map[string]struct{}{"__reserved__": {}, "grafana_folder": {}},
NotificationSettingsValidator: func(ctx context.Context, receiver string) (bool, error) { return receiver == "notif-ok", nil },
}
}
func TestAlertRuleValidation_Success(t *testing.T) {
r := &v1.AlertRule{}
r.SetGroupVersionKind(v1.AlertRuleKind().GroupVersionKind())
r.Name = "uid-1"
r.Namespace = "ns1"
r.Annotations = map[string]string{v1.FolderAnnotationKey: "f1"}
r.Labels = map[string]string{}
r.Spec = v1.AlertRuleSpec{
Title: "ok",
Trigger: v1.AlertRuleIntervalTrigger{Interval: v1.AlertRulePromDuration("60s")},
Expressions: v1.AlertRuleExpressionMap{"A": v1.AlertRuleExpression{Model: map[string]any{"expr": "1"}, Source: boolPtr(true)}},
NoDataState: v1.DefaultNoDataState,
ExecErrState: v1.DefaultExecErrState,
NotificationSettings: &v1.AlertRuleV0alpha1SpecNotificationSettings{Receiver: "notif-ok"},
}
req := &appsdk.AdmissionRequest{Action: resource.AdmissionActionCreate, Object: r}
validator := alertrule.NewValidator(makeDefaultRuntimeConfig())
if err := validator.Validate(context.Background(), req); err != nil {
t.Fatalf("expected success, got error: %v", err)
}
}
func TestAlertRuleValidation_Errors(t *testing.T) {
mk := func(mut func(r *v1.AlertRule)) error {
r := baseAlertRule()
mut(r)
return alertrule.NewValidator(makeDefaultRuntimeConfig()).Validate(context.Background(), &appsdk.AdmissionRequest{Action: resource.AdmissionActionCreate, Object: r})
}
if err := mk(func(r *v1.AlertRule) { r.Annotations = nil }); err == nil {
t.Errorf("want folder required error")
}
if err := mk(func(r *v1.AlertRule) { r.Annotations[v1.FolderAnnotationKey] = "bad" }); err == nil {
t.Errorf("want folder not exist error")
}
if err := mk(func(r *v1.AlertRule) { r.Spec.Trigger.Interval = v1.AlertRulePromDuration("30s") }); err == nil {
t.Errorf("want base interval multiple error")
}
if err := mk(func(r *v1.AlertRule) {
r.Spec.NotificationSettings = &v1.AlertRuleV0alpha1SpecNotificationSettings{Receiver: "bad"}
}); err == nil {
t.Errorf("want invalid receiver error")
}
if err := mk(func(r *v1.AlertRule) { r.Labels[v1.GroupLabelKey] = "grp" }); err == nil {
t.Errorf("want group set on create error")
}
if err := mk(func(r *v1.AlertRule) { r.Spec.For = strPtr("-10s") }); err == nil {
t.Errorf("want for>=0 error")
}
if err := mk(func(r *v1.AlertRule) {
if r.Spec.Labels == nil {
r.Spec.Labels = map[string]v1.AlertRuleTemplateString{}
}
r.Spec.Labels["__reserved__"] = v1.AlertRuleTemplateString("x")
}); err == nil {
t.Errorf("want reserved label key error")
}
}
func baseAlertRule() *v1.AlertRule {
r := &v1.AlertRule{}
r.SetGroupVersionKind(v1.AlertRuleKind().GroupVersionKind())
r.Name = "uid-1"
r.Namespace = "ns1"
r.Annotations = map[string]string{v1.FolderAnnotationKey: "f1"}
r.Labels = map[string]string{}
r.Spec = v1.AlertRuleSpec{
Title: "ok",
Trigger: v1.AlertRuleIntervalTrigger{Interval: v1.AlertRulePromDuration("60s")},
Expressions: v1.AlertRuleExpressionMap{"A": v1.AlertRuleExpression{Model: map[string]any{"expr": "1"}, Source: boolPtr(true)}},
NoDataState: v1.DefaultNoDataState,
ExecErrState: v1.DefaultExecErrState,
}
return r
}
func TestRecordingRuleValidation_Success(t *testing.T) {
r := &v1.RecordingRule{}
r.SetGroupVersionKind(v1.RecordingRuleKind().GroupVersionKind())
r.Name = "uid-2"
r.Namespace = "ns1"
r.Annotations = map[string]string{v1.FolderAnnotationKey: "f1"}
r.Labels = map[string]string{}
r.Spec = v1.RecordingRuleSpec{
Title: "ok",
Trigger: v1.RecordingRuleIntervalTrigger{Interval: v1.RecordingRulePromDuration("60s")},
Expressions: v1.RecordingRuleExpressionMap{"A": v1.RecordingRuleExpression{Model: map[string]any{"expr": "1"}, Source: boolPtr(true)}},
Metric: "test_metric",
TargetDatasourceUID: "ds1",
}
req := &appsdk.AdmissionRequest{Action: resource.AdmissionActionCreate, Object: r}
validator := recordingrule.NewValidator(makeDefaultRuntimeConfig())
if err := validator.Validate(context.Background(), req); err != nil {
t.Fatalf("expected success, got error: %v", err)
}
}
func TestRecordingRuleValidation_Errors(t *testing.T) {
mk := func(mut func(r *v1.RecordingRule)) error {
r := baseRecordingRule()
mut(r)
return recordingrule.NewValidator(makeDefaultRuntimeConfig()).Validate(context.Background(), &appsdk.AdmissionRequest{Action: resource.AdmissionActionCreate, Object: r})
}
if err := mk(func(r *v1.RecordingRule) { r.Annotations = nil }); err == nil {
t.Errorf("want folder required error")
}
if err := mk(func(r *v1.RecordingRule) { r.Annotations[v1.FolderAnnotationKey] = "bad" }); err == nil {
t.Errorf("want folder not exist error")
}
if err := mk(func(r *v1.RecordingRule) { r.Spec.Trigger.Interval = v1.RecordingRulePromDuration("30s") }); err == nil {
t.Errorf("want base interval multiple error")
}
if err := mk(func(r *v1.RecordingRule) { r.Labels[v1.GroupLabelKey] = "grp" }); err == nil {
t.Errorf("want group set on create error")
}
if err := mk(func(r *v1.RecordingRule) { r.Spec.Metric = "" }); err == nil {
t.Errorf("want metric required error")
}
if err := mk(func(r *v1.RecordingRule) {
if r.Spec.Labels == nil {
r.Spec.Labels = map[string]v1.RecordingRuleTemplateString{}
}
r.Spec.Labels["__reserved__"] = v1.RecordingRuleTemplateString("x")
}); err == nil {
t.Errorf("want reserved label key error")
}
}
func baseRecordingRule() *v1.RecordingRule {
r := &v1.RecordingRule{}
r.SetGroupVersionKind(v1.RecordingRuleKind().GroupVersionKind())
r.Name = "uid-1"
r.Namespace = "ns1"
r.Annotations = map[string]string{v1.FolderAnnotationKey: "f1"}
r.Labels = map[string]string{}
r.Spec = v1.RecordingRuleSpec{
Title: "ok",
Trigger: v1.RecordingRuleIntervalTrigger{Interval: v1.RecordingRulePromDuration("60s")},
Expressions: v1.RecordingRuleExpressionMap{"A": v1.RecordingRuleExpression{Model: map[string]any{"expr": "1"}, Source: boolPtr(true)}},
Metric: "test_metric",
TargetDatasourceUID: "ds1",
}
return r
}
func boolPtr(b bool) *bool { return &b }
func strPtr(s string) *string { return &s }


@@ -0,0 +1,22 @@
package config
import (
"context"
"errors"
"time"
)
var (
ErrInvalidRuntimeConfig = errors.New("invalid runtime config provided to alerting/rules app")
)
// RuntimeConfig holds the configuration values the alerting/rules app needs at runtime from the running Grafana instance.
type RuntimeConfig struct {
// FolderValidator reports whether a folder with the given UID exists.
FolderValidator func(ctx context.Context, folderUID string) (bool, error)
// BaseEvaluationInterval is the base evaluation interval; rule trigger intervals must be a multiple of it.
BaseEvaluationInterval time.Duration
// ReservedLabelKeys is the set of label keys that rules are not allowed to specify.
ReservedLabelKeys map[string]struct{}
// NotificationSettingsValidator reports whether the named receiver is valid.
NotificationSettingsValidator func(ctx context.Context, receiver string) (bool, error)
}


@@ -0,0 +1,37 @@
package recordingrule
import (
"context"
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/simple"
v1 "github.com/grafana/grafana/apps/alerting/rules/pkg/apis/alerting/v0alpha1"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
)
func NewMutator(cfg config.RuntimeConfig) *simple.Mutator {
return &simple.Mutator{
MutateFunc: func(ctx context.Context, req *app.AdmissionRequest) (*app.MutatingResponse, error) {
r, ok := req.Object.(*v1.RecordingRule)
if !ok || r == nil {
return nil, nil
}
folderUID := ""
if r.Annotations != nil {
folderUID = r.Annotations[v1.FolderAnnotationKey]
}
if folderUID != "" {
if r.Labels == nil {
r.Labels = make(map[string]string)
}
r.Labels[v1.FolderLabelKey] = folderUID
}
if err := r.Spec.ClampDurations(); err != nil {
return nil, err
}
return &app.MutatingResponse{UpdatedObject: r}, nil
},
}
}


@@ -0,0 +1,95 @@
package recordingrule
import (
"context"
"fmt"
"slices"
"strconv"
"github.com/grafana/grafana-app-sdk/app"
"github.com/grafana/grafana-app-sdk/resource"
"github.com/grafana/grafana-app-sdk/simple"
model "github.com/grafana/grafana/apps/alerting/rules/pkg/apis/alerting/v0alpha1"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
"github.com/grafana/grafana/apps/alerting/rules/pkg/app/util"
prom_model "github.com/prometheus/common/model"
)
func NewValidator(cfg config.RuntimeConfig) *simple.Validator {
return &simple.Validator{
ValidateFunc: func(ctx context.Context, req *app.AdmissionRequest) error {
// Cast to specific type
r, ok := req.Object.(*model.RecordingRule)
if !ok {
return fmt.Errorf("object is not of type *v0alpha1.RecordingRule")
}
sourceProv := r.GetProvenanceStatus()
if !slices.Contains(model.AcceptedProvenanceStatuses, sourceProv) {
return fmt.Errorf("invalid provenance status: %s", sourceProv)
}
group := r.Labels[model.GroupLabelKey]
groupIndexStr := r.Labels[model.GroupIndexLabelKey]
if req.Action == resource.AdmissionActionCreate {
if group != "" || groupIndexStr != "" {
return fmt.Errorf("cannot set group when creating recording rule")
}
}
if group != "" {
if groupIndexStr == "" {
return fmt.Errorf("%s must be set when %s is set", model.GroupIndexLabelKey, model.GroupLabelKey)
}
if _, err := strconv.Atoi(groupIndexStr); err != nil {
return fmt.Errorf("invalid %s: %w", model.GroupIndexLabelKey, err)
}
}
folderUID := ""
if r.Annotations != nil {
folderUID = r.Annotations[model.FolderAnnotationKey]
}
if folderUID == "" {
return fmt.Errorf("folder is required")
}
if cfg.FolderValidator != nil {
ok, verr := cfg.FolderValidator(ctx, folderUID)
if verr != nil {
return fmt.Errorf("failed to validate folder: %w", verr)
}
if !ok {
return fmt.Errorf("folder does not exist: %s", folderUID)
}
}
if len(r.Spec.Title) > model.AlertRuleMaxTitleLength {
return fmt.Errorf("recording rule title is too long. Max length is %d", model.AlertRuleMaxTitleLength)
}
if err := util.ValidateInterval(cfg.BaseEvaluationInterval, &r.Spec.Trigger.Interval); err != nil {
return err
}
if r.Spec.Labels != nil {
for key := range r.Spec.Labels {
if _, bad := cfg.ReservedLabelKeys[key]; bad {
return fmt.Errorf("label key is reserved and cannot be specified: %s", key)
}
}
}
if r.Spec.Metric == "" {
return fmt.Errorf("metric must be specified")
}
metric := prom_model.LabelValue(r.Spec.Metric)
if !metric.IsValid() {
return fmt.Errorf("metric contains invalid characters")
}
if !prom_model.IsValidMetricName(metric) { // nolint:staticcheck
return fmt.Errorf("invalid metric name")
}
return nil
},
}
}
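The metric-name check at the end of the validator can be illustrated without the Prometheus dependency. This is a sketch of what `prom_model.IsValidMetricName` enforces under classic validation (newer Prometheus versions can optionally allow UTF-8 names, which this does not cover):

```go
package main

import (
	"fmt"
	"regexp"
)

// metricNameRe is the classic Prometheus metric-name pattern:
// a leading letter, underscore, or colon, then letters, digits,
// underscores, or colons.
var metricNameRe = regexp.MustCompile(`^[a-zA-Z_:][a-zA-Z0-9_:]*$`)

func isValidMetricName(name string) bool {
	return name != "" && metricNameRe.MatchString(name)
}

func main() {
	fmt.Println(isValidMetricName("test_metric")) // true
	fmt.Println(isValidMetricName("1bad"))        // false
	fmt.Println(isValidMetricName("has space"))   // false
}
```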


@@ -0,0 +1,27 @@
package util
import (
"fmt"
"time"
)
type DurationLike interface {
ToDuration() (time.Duration, error)
}
func ValidateInterval(baseInterval time.Duration, d DurationLike) error {
interval, err := d.ToDuration()
if err != nil {
return fmt.Errorf("invalid trigger interval: %w", err)
}
// Ensure interval is positive and an integer multiple of BaseEvaluationInterval (if provided)
if interval <= 0 {
return fmt.Errorf("trigger interval must be greater than 0")
}
if baseInterval > 0 {
if (interval % baseInterval) != 0 {
return fmt.Errorf("trigger interval must be a multiple of base evaluation interval (%s)", baseInterval.String())
}
}
return nil
}


@@ -0,0 +1,172 @@
package jobs
import (
apierrors "k8s.io/apimachinery/pkg/api/errors"
"k8s.io/apimachinery/pkg/util/validation/field"
provisioning "github.com/grafana/grafana/apps/provisioning/pkg/apis/provisioning/v0alpha1"
"github.com/grafana/grafana/apps/provisioning/pkg/repository/git"
"github.com/grafana/grafana/apps/provisioning/pkg/safepath"
)
// ValidateJob performs validation on the Job specification and returns an error if validation fails
func ValidateJob(job *provisioning.Job) error {
list := field.ErrorList{}
// Validate action is specified
if job.Spec.Action == "" {
list = append(list, field.Required(field.NewPath("spec", "action"), "action must be specified"))
return toError(job.Name, list) // Early return since we can't validate further without knowing the action
}
// Validate repository is specified
if job.Spec.Repository == "" {
list = append(list, field.Required(field.NewPath("spec", "repository"), "repository must be specified"))
}
// Validate action-specific options
switch job.Spec.Action {
case provisioning.JobActionPull:
if job.Spec.Pull == nil {
list = append(list, field.Required(field.NewPath("spec", "pull"), "pull options required for pull action"))
}
// Pull options are simple, just incremental bool - no further validation needed
case provisioning.JobActionPush:
if job.Spec.Push == nil {
list = append(list, field.Required(field.NewPath("spec", "push"), "push options required for push action"))
} else {
list = append(list, validateExportJobOptions(job.Spec.Push)...)
}
case provisioning.JobActionPullRequest:
if job.Spec.PullRequest == nil {
list = append(list, field.Required(field.NewPath("spec", "pr"), "pull request options required for pr action"))
}
// PullRequest options are mostly informational - no strict validation needed
case provisioning.JobActionMigrate:
if job.Spec.Migrate == nil {
list = append(list, field.Required(field.NewPath("spec", "migrate"), "migrate options required for migrate action"))
}
// Migrate options are simple - no further validation needed
case provisioning.JobActionDelete:
if job.Spec.Delete == nil {
list = append(list, field.Required(field.NewPath("spec", "delete"), "delete options required for delete action"))
} else {
list = append(list, validateDeleteJobOptions(job.Spec.Delete)...)
}
case provisioning.JobActionMove:
if job.Spec.Move == nil {
list = append(list, field.Required(field.NewPath("spec", "move"), "move options required for move action"))
} else {
list = append(list, validateMoveJobOptions(job.Spec.Move)...)
}
default:
list = append(list, field.Invalid(field.NewPath("spec", "action"), job.Spec.Action, "invalid action"))
}
return toError(job.Name, list)
}
// toError converts a field.ErrorList to an error, returning nil if the list is empty
func toError(name string, list field.ErrorList) error {
if len(list) == 0 {
return nil
}
return apierrors.NewInvalid(
provisioning.JobResourceInfo.GroupVersionKind().GroupKind(),
name, list)
}
// validateExportJobOptions validates export (push) job options
func validateExportJobOptions(opts *provisioning.ExportJobOptions) field.ErrorList {
list := field.ErrorList{}
// Validate branch name if specified
if opts.Branch != "" {
if !git.IsValidGitBranchName(opts.Branch) {
list = append(list, field.Invalid(field.NewPath("spec", "push", "branch"), opts.Branch, "invalid git branch name"))
}
}
// Validate path if specified
if opts.Path != "" {
if err := safepath.IsSafe(opts.Path); err != nil {
list = append(list, field.Invalid(field.NewPath("spec", "push", "path"), opts.Path, err.Error()))
}
}
return list
}
// validateDeleteJobOptions validates delete job options
func validateDeleteJobOptions(opts *provisioning.DeleteJobOptions) field.ErrorList {
list := field.ErrorList{}
// At least one of paths or resources must be specified
if len(opts.Paths) == 0 && len(opts.Resources) == 0 {
list = append(list, field.Required(field.NewPath("spec", "delete"), "at least one path or resource must be specified"))
return list
}
// Validate paths
for i, p := range opts.Paths {
if err := safepath.IsSafe(p); err != nil {
list = append(list, field.Invalid(field.NewPath("spec", "delete", "paths").Index(i), p, err.Error()))
}
}
// Validate resources
for i, r := range opts.Resources {
if r.Name == "" {
list = append(list, field.Required(field.NewPath("spec", "delete", "resources").Index(i).Child("name"), "resource name is required"))
}
if r.Kind == "" {
list = append(list, field.Required(field.NewPath("spec", "delete", "resources").Index(i).Child("kind"), "resource kind is required"))
}
}
return list
}
// validateMoveJobOptions validates move job options
func validateMoveJobOptions(opts *provisioning.MoveJobOptions) field.ErrorList {
list := field.ErrorList{}
// At least one of paths or resources must be specified
if len(opts.Paths) == 0 && len(opts.Resources) == 0 {
list = append(list, field.Required(field.NewPath("spec", "move"), "at least one path or resource must be specified"))
return list
}
// Target path is required
if opts.TargetPath == "" {
list = append(list, field.Required(field.NewPath("spec", "move", "targetPath"), "target path is required"))
} else {
if err := safepath.IsSafe(opts.TargetPath); err != nil {
list = append(list, field.Invalid(field.NewPath("spec", "move", "targetPath"), opts.TargetPath, err.Error()))
}
}
// Validate source paths
for i, p := range opts.Paths {
if err := safepath.IsSafe(p); err != nil {
list = append(list, field.Invalid(field.NewPath("spec", "move", "paths").Index(i), p, err.Error()))
}
}
// Validate resources
for i, r := range opts.Resources {
if r.Name == "" {
list = append(list, field.Required(field.NewPath("spec", "move", "resources").Index(i).Child("name"), "resource name is required"))
}
if r.Kind == "" {
list = append(list, field.Required(field.NewPath("spec", "move", "resources").Index(i).Child("kind"), "resource kind is required"))
}
}
return list
}



@@ -0,0 +1,593 @@
package jobs
import (
"testing"
"github.com/stretchr/testify/require"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
provisioning "github.com/grafana/grafana/apps/provisioning/pkg/apis/provisioning/v0alpha1"
)
func TestValidateJob(t *testing.T) {
tests := []struct {
name string
job *provisioning.Job
wantErr bool
validateError func(t *testing.T, err error)
}{
{
name: "valid pull job",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPull,
Repository: "test-repo",
Pull: &provisioning.SyncJobOptions{
Incremental: true,
},
},
},
wantErr: false,
},
{
name: "missing action",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.action: Required value")
},
},
{
name: "invalid action",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobAction("invalid"),
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.action: Invalid value")
},
},
{
name: "missing repository",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPull,
Pull: &provisioning.SyncJobOptions{
Incremental: true,
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.repository: Required value")
},
},
{
name: "pull action without pull options",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPull,
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.pull: Required value")
},
},
{
name: "push action without push options",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPush,
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.push: Required value")
},
},
{
name: "valid push job with valid branch",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPush,
Repository: "test-repo",
Push: &provisioning.ExportJobOptions{
Branch: "main",
Message: "Test commit",
},
},
},
wantErr: false,
},
{
name: "push job with invalid branch name",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPush,
Repository: "test-repo",
Push: &provisioning.ExportJobOptions{
Branch: "feature..branch", // Invalid: contains consecutive dots
Message: "Test commit",
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.push.branch")
require.Contains(t, err.Error(), "invalid git branch name")
},
},
{
name: "push job with invalid path",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPush,
Repository: "test-repo",
Push: &provisioning.ExportJobOptions{
Path: "../../../etc/passwd", // Invalid: path traversal
Message: "Test commit",
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.push.path")
},
},
{
name: "delete action without options",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.delete: Required value")
},
},
{
name: "delete action without paths or resources",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "at least one path or resource must be specified")
},
},
{
name: "valid delete action with paths",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{
Paths: []string{"dashboard.json", "folder/other.json"},
},
},
},
wantErr: false,
},
{
name: "valid delete action with resources",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{
Resources: []provisioning.ResourceRef{
{
Name: "my-dashboard",
Kind: "Dashboard",
},
},
},
},
},
wantErr: false,
},
{
name: "delete action with invalid path",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{
Paths: []string{"../../etc/passwd"}, // Invalid: path traversal
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.delete.paths[0]")
},
},
{
name: "delete action with resource missing name",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{
Resources: []provisioning.ResourceRef{
{
Kind: "Dashboard",
},
},
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.delete.resources[0].name")
},
},
{
name: "move action without options",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.move: Required value")
},
},
{
name: "move action without paths or resources",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
TargetPath: "new-location/",
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "at least one path or resource must be specified")
},
},
{
name: "move action without target path",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Paths: []string{"dashboard.json"},
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.move.targetPath: Required value")
},
},
{
name: "valid move action",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Paths: []string{"old-location/dashboard.json"},
TargetPath: "new-location/",
},
},
},
wantErr: false,
},
{
name: "move action with invalid target path",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Paths: []string{"dashboard.json"},
TargetPath: "../../../etc/", // Invalid: path traversal
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.move.targetPath")
},
},
{
name: "valid migrate job",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMigrate,
Repository: "test-repo",
Migrate: &provisioning.MigrateJobOptions{
History: true,
Message: "Migrate from legacy",
},
},
},
wantErr: false,
},
{
name: "migrate action without migrate options",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMigrate,
Repository: "test-repo",
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.migrate: Required value")
},
},
{
name: "valid pr job",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPullRequest,
Repository: "test-repo",
PullRequest: &provisioning.PullRequestJobOptions{
PR: 123,
Ref: "refs/pull/123/head",
},
},
},
wantErr: false,
},
{
name: "delete action with resource missing kind",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{
Resources: []provisioning.ResourceRef{
{
Name: "my-dashboard",
},
},
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.delete.resources[0].kind")
},
},
{
name: "move action with valid resources",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Resources: []provisioning.ResourceRef{
{
Name: "my-dashboard",
Kind: "Dashboard",
},
},
TargetPath: "new-location/",
},
},
},
wantErr: false,
},
{
name: "move action with resource missing kind",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Resources: []provisioning.ResourceRef{
{
Name: "my-dashboard",
},
},
TargetPath: "new-location/",
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.move.resources[0].kind")
},
},
{
name: "move action with both paths and resources",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Paths: []string{"dashboard.json"},
Resources: []provisioning.ResourceRef{
{
Name: "my-dashboard",
Kind: "Dashboard",
},
},
TargetPath: "new-location/",
},
},
},
wantErr: false,
},
{
name: "move action with invalid source path",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionMove,
Repository: "test-repo",
Move: &provisioning.MoveJobOptions{
Paths: []string{"../invalid/path"},
TargetPath: "valid/target/",
},
},
},
wantErr: true,
validateError: func(t *testing.T, err error) {
require.Contains(t, err.Error(), "spec.move.paths[0]")
},
},
{
name: "delete action with both paths and resources",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionDelete,
Repository: "test-repo",
Delete: &provisioning.DeleteJobOptions{
Paths: []string{"dashboard.json"},
Resources: []provisioning.ResourceRef{
{
Name: "my-dashboard",
Kind: "Dashboard",
},
},
},
},
},
wantErr: false,
},
{
name: "push action with valid path",
job: &provisioning.Job{
ObjectMeta: metav1.ObjectMeta{
Name: "test-job",
},
Spec: provisioning.JobSpec{
Action: provisioning.JobActionPush,
Repository: "test-repo",
Push: &provisioning.ExportJobOptions{
Path: "some/valid/path",
Message: "Test commit",
},
},
},
wantErr: false,
},
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
err := ValidateJob(tt.job)
if tt.wantErr {
require.Error(t, err)
if tt.validateError != nil {
tt.validateError(t, err)
}
} else {
require.NoError(t, err)
}
})
}
}


@@ -0,0 +1,154 @@
---
aliases:
labels:
products:
- cloud
- enterprise
- oss
menuTitle: Concepts
title: Data sources, plugins, and integrations
weight: 70
refs:
data-source-management:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/administration/data-source-management/
- pattern: /docs/grafana-cloud/
destination: /docs/grafana/<GRAFANA_VERSION>/administration/data-source-management/
plugin-management:
- pattern: /docs/grafana/
destination: /docs/grafana/<GRAFANA_VERSION>/administration/plugin-management/
- pattern: /docs/grafana-cloud
destination: /docs/grafana/<GRAFANA_VERSION>/administration/plugin-management/
---
# Data sources, plugins, and integrations
When working with Grafana, you'll encounter three key concepts: data sources, plugins, and integrations. Each is essential to building effective monitoring solutions, but they serve distinct purposes and are often confused with one another. This document explains what each concept is, what it does, when to use it, and how the three work together to create observability solutions in Grafana.
## Data sources
A data source is a connection to a specific database, monitoring system, service, or other external location that stores data, metrics, logs, or traces. Examples include Prometheus, InfluxDB, PostgreSQL, or CloudWatch. When you configure a data source in Grafana, you're telling it where to fetch data from, providing connection details, credentials, and endpoints. Data sources are the foundation for working with Grafana. Without them, Grafana has nothing to visualize. Once configured, you can query your Prometheus data source to display CPU metrics, or query CloudWatch to visualize AWS infrastructure performance.
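Data sources can be configured in the UI or provisioned from a YAML file. A minimal provisioning sketch for a Prometheus data source, with illustrative connection values (the exact fields vary by data source type):

```yaml
# provisioning/datasources/prometheus.yaml (illustrative values)
apiVersion: 1
datasources:
  - name: Prometheus
    type: prometheus
    access: proxy
    url: http://localhost:9090
    isDefault: true
```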
## Plugins
A plugin extends Grafana's core functionality. Plugins can add new data source types, visualization panels, or full-featured applications that integrate with Grafana. They make Grafana modular and extensible.
Plugins come in three types:
- **Data source plugins** connect Grafana to **external data sources**. You use this type of plugin when you want to access and work with data from an external source or third party. Examples include Prometheus, MSSQL, and Databricks.
- **Panel plugins** control how data appears in Grafana dashboards. Examples of panel plugins include pie chart, candlestick, and traffic light. Note that in some cases, panels don't rely on a data source at all. The **Text** panel can render static or templated content without querying data. Panels can also support user-driven actions. For example, the **Button** panel can trigger workflows or external calls.
- **App plugins** allow you to bundle data sources and panel plugins within a single package. They enable you to create custom pages within Grafana that can function like dashboards, providing dedicated spaces for documentation, sign-up forms, custom UI extensions, and integration with other services via HTTP. Cloud apps built as app plugins, such as Azure Cloud Native Monitoring and Redis Application, offer out-of-the-box observability solutions with more comprehensive monitoring capabilities than standalone integrations.
## Integrations
_Integrations are exclusive to Grafana Cloud._ An integration is a pre-packaged monitoring solution that bundles export/scrape configurations, pre-built dashboards, alert rules, and sometimes recording rules. Unlike standalone data sources, integrations handle the complete workflow: they configure how telemetry is collected and sent to Grafana Cloud's hosted databases, then provide ready-to-use dashboards and alerts. For example, a Kubernetes integration configures metric collection from your cluster, creates dashboards for monitoring, and sets up common alerts, all working together out of the box.
## When to use each
Use a data source when:
- You want to connect Grafana to a specific system (for example, Prometheus or MySQL).
- You're building custom dashboards with hand-picked metrics and visualizations.
- Your monitoring needs are unique or not covered by pre-packaged integrations.
Use a plugin when:
- You need to connect to a system Grafana doesn't support natively.
- You want to add new functionality (visualizations, workflows, or app-style extensions).
- You have specialized or industry-specific requirements (for example, IoT).
Use an integration when:
- You're using Grafana Cloud and want a quick, pre-built setup.
- You prefer minimal configuration with ready-to-use dashboards and alerts.
- You're new to observability and want to learn what good monitoring looks like.
## Relationships and interactions
How data sources, plugins, and integrations work together:
- Plugins extend what Grafana can do.
- Data sources define where Grafana reads data from.
- Integrations combine telemetry collection and pre-built content to create complete monitoring solutions.
Examples:
- Install the Databricks data source plugin. Configure the Databricks data source and run SQL queries against your Databricks workspace. Use the `Histogram` panel to visualize distributions in your query results, such as latency buckets, job durations, or model output scores.
- Install the Redis Application app plugin. This app provides a unified experience for monitoring Redis by working with your existing Redis data source. It adds custom pages for configuration and exploration, along with prebuilt dashboards, commands, and visualizations that help you analyze performance, memory usage, and key activity.
<!-- - Install the Azure Cloud Native Monitoring app plugin, which bundles the app and data source plugin types. It includes data source plugins for Azure Monitor and Log Analytics, panel plugins for visualizing Azure metrics, and a custom configuration page for managing authentication and subscriptions. -->
- If you're using Grafana Cloud, add the ClickHouse integration. This integration provides pre-built dashboards and alerts to monitor ClickHouse cluster metrics and logs, so you can visualize and analyze ClickHouse performance and health in real time.
## Frequently asked questions
**What's the difference between a data source and a data source plugin?**
A data source plugin is a **software component that enables Grafana to communicate** with specific types of databases or services, like Prometheus, MySQL, or InfluxDB. A data source is **an actual configured connection** to one of these databases, including the credentials, URL, and settings needed to retrieve data.
Think of it this way: You _install_ a plugin but _configure_ a data source.
**Do I need a plugin to use a data source?**
You must install the plugin before you configure or use the data source. Each data source plugin has its own versioning and lifecycle. Grafana includes built-in core data sources, which can be thought of as pre-installed plugins.
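In self-hosted Grafana, the install-then-configure order typically looks like the following. The plugin ID shown is illustrative, and the restart command depends on how Grafana is installed:

```shell
# Install a plugin from the Grafana plugin catalog by its plugin ID
grafana-cli plugins install yesoreyeram-infinity-datasource

# Restart Grafana so the newly installed plugin is loaded
# (systemd-based installs; other install methods differ)
sudo systemctl restart grafana-server
```

After the restart, the new data source type appears under **Connections**, where you can configure one or more data sources that use it.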
**Can I use integrations in self-hosted Grafana?**
No, integrations are exclusive to Grafana Cloud. In self-hosted Grafana, you can replicate similar setups manually using data sources and dashboards.
**Aren't integrations just pre-built dashboards?**
No, integrations are much more than dashboards. While dashboards are part of an integration, they're only one piece. Integrations typically include:
- Data collection setup (for example, pre-configured agents or exporters).
- Predefined metrics and queries tailored to the technology.
- Alerting rules and notifications to help detect common issues.
- Dashboards to visualize and explore that data.
**What's the difference between plugin types?**
A data source plugin in Grafana is a software component that enables Grafana to connect to and retrieve data from various external data sources. After you install the plugin, you can use it to configure one or more data sources. Each data source defines the actual connection details, like the server URL, authentication method, and query options.
A panel plugin in Grafana is an extension that allows you to add new and custom visualizations to your Grafana dashboards. While Grafana comes with several built-in panel types (like graphs, single stats, and tables), panel plugins extend this functionality by providing specialized ways to display data.
An app plugin in Grafana is a type of plugin that provides a comprehensive, integrated, and often out-of-the-box experience within Grafana. Unlike data source plugins, which connect to external data sources, or panel plugins, which provide new visualization types, app plugins can combine various functionalities to create a more complete experience.
**How do data sources and integrations differ in how they handle data?**
Data sources query data where it already lives. They connect Grafana to an external system or database, such as Prometheus, MySQL, or Elasticsearch, and fetch data on demand. You keep full control over your own data stores, schemas, and retention policies.
In contrast, integrations focus on getting data into Grafana Cloud's hosted backends. They ingest metrics, logs, and traces into systems like Mimir, Loki, or Tempo, using pre-configured agents and pipelines. Instead of querying an external database, Grafana queries its own managed storage where the integration has placed the data.
## Summary reference
Use the following table to compare how data sources, plugins, and integrations differ in scope, purpose, and use. It highlights where each applies within Grafana, what problems it solves, and how they work together to build observability solutions.
| Concept | Where it applies | Purpose | What it includes | When to use it | Example |
| ---------------------- | ---------------------- | ---------------------------------------------------- | ----------------------------------------------------------- | ------------------------------------------------------- | ------------------------------------------ |
| **Data source** | Self-hosted and Cloud | Connect to external metrics, logs, or traces storage | Connection settings, auth, query config | Visualize data from a database or monitoring system | Prometheus, CloudWatch, PostgreSQL |
| **Plugin** | Self-hosted and Cloud | Extend Grafana with new capabilities | Three types: data source, panel, and app | Add connectivity or functionality not included by default | Plotly panel, MongoDB data source |
| **App plugin** | Self-hosted and Cloud | Bundle plugins with custom pages or UI | Data source + panel plugins + custom routes | Create a dedicated app-like experience | Azure Cloud Native Monitoring |
| **Panel plugin** | Self-hosted and Cloud | Add new visualization types | Custom panels and visualization logic | Display data beyond built-in visualizations | Pie chart, Candlestick, Geomap |
| **Data source plugin** | Self-hosted and Cloud | Connect to a new external system type | Connector code for querying that system | Access data from an unsupported backend | Databricks, MongoDB, MSSQL |
| **Integration** | Grafana Cloud only | Pre-packaged observability for a specific technology | Telemetry config, dashboards, alerts, recording rules | Get an out-of-the-box setup with minimal configuration | Kubernetes, Redis, NGINX |
For detailed documentation and how-to guides related to data sources, plugins, and integrations, refer to the following references:
**Data sources**:
- [Manage data sources](ref:data-source-management)
**Plugins**:
- [Plugin types and usage](https://grafana.com/developers/plugin-tools/key-concepts/plugin-types-usage)
- [App plugins](https://grafana.com/developers/plugin-tools/how-to-guides/app-plugins/)
- [Data source plugins](https://grafana.com/developers/plugin-tools/how-to-guides/data-source-plugins/)
- [Panel plugins](https://grafana.com/developers/plugin-tools/how-to-guides/panel-plugins/)
**Integrations**:
- [Grafana integrations](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/integrations/)
- [Install and manage integrations](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/integrations/install-and-manage-integrations/)

View File

@@ -40,7 +40,6 @@ Most [generally available](https://grafana.com/docs/release-life-cycle/#general-
| `transformationsRedesign` | Enables the transformations redesign | Yes |
| `awsAsyncQueryCaching` | Enable caching for async queries for Redshift and Athena. Requires that the datasource has caching and async query support enabled | Yes |
| `dashgpt` | Enable AI powered features in dashboards | Yes |
| `panelMonitoring` | Enables panel monitoring through logs and measurements | Yes |
| `formatString` | Enable format string transformer | Yes |
| `kubernetesDashboards` | Use the kubernetes API in the frontend for dashboards | Yes |
| `addFieldFromCalculationStatFunctions` | Add cumulative and window functions to the add field from calculation transformation | Yes |

View File

@@ -1,16 +1,9 @@
import { Page, Locator } from '@playwright/test';
import { test, expect } from '@grafana/plugin-e2e';
import testDashboard from '../dashboards/AdHocFilterTest.json';
import { getCell } from '../panels-suite/table-utils';
// Helper function to get a specific cell in a table
const getCell = async (loc: Page | Locator, rowIdx: number, colIdx: number) =>
loc
.getByRole('row')
.nth(rowIdx)
.getByRole(rowIdx === 0 ? 'columnheader' : 'gridcell')
.nth(colIdx);
const fixture = require('../fixtures/prometheus-response.json');
test.describe(
'Dashboard with Table powered by Prometheus data source',
@@ -46,80 +39,90 @@ test.describe(
gotoDashboardPage,
selectors,
}) => {
// Handle query and query_range API calls
// Handle query and query_range API calls. Ideally, this would instead be directly tested against gdev-prometheus.
await page.route(/\/api\/ds\/query/, async (route) => {
const fixture = require('../fixtures/prometheus-response.json');
// during the test, we select the "inner_eval" slice to filter; this simulates the behavior
// of prometheus applying that filter and removing dataframes from the response.
if (route.request().postData()?.includes('{slice=\\\"inner_eval\\\"}')) {
fixture.results.A.frames.splice(1, 1);
const response = JSON.parse(JSON.stringify(fixture));
// This simulates the behavior of prometheus applying a filter and removing dataframes from the response where
// the label matches the selected filter. We check for either the slice being applied inline into the prometheus
// query or the adhoc filter being present in the request body of prometheus applying that filter and removing
// dataframes from the response.
const postData = route.request().postData();
const match =
postData?.match(/{slice=\\\"([\w_]+)\\\"}/) ??
postData?.match(/"adhocFilters":\[{"key":"slice","operator":"equals","value":"([\w_]+)"}\]/);
if (match) {
response.results.A.frames = response.results.A.frames.filter((frame) =>
frame.schema.fields.every((field) => !field.labels || field.labels.slice === match[1])
);
}
await route.fulfill({
status: 200,
contentType: 'application/json',
body: JSON.stringify(fixture),
body: JSON.stringify(response),
});
});
const dashboardPage = await gotoDashboardPage({ uid: dashboardUID });
const panel = dashboardPage.getByGrafanaSelector(
let panel = dashboardPage.getByGrafanaSelector(
selectors.components.Panels.Panel.title('Table powered by Prometheus')
);
await expect(panel).toBeVisible();
await expect(panel, 'panel is rendered').toBeVisible();
// Wait for the table to load completely
await expect(panel.locator('.rdg')).toBeVisible();
const table = panel.locator('.rdg');
await expect(table, 'table is rendered').toBeVisible();
// Get the first data cell in the third column (row 1, column 2)
const labelValueCell = await getCell(panel, 1, 1);
await expect(labelValueCell).toBeVisible();
const firstValue = (await getCell(table, 1, 1).textContent())!;
const secondValue = (await getCell(table, 2, 1).textContent())!;
expect(firstValue, `first cell is "${firstValue}"`).toBeTruthy();
expect(secondValue, `second cell is "${secondValue}"`).toBeTruthy();
expect(firstValue, 'first and second cell values are different').not.toBe(secondValue);
// Get the cell value before clicking the filter button
const labelValue = await labelValueCell.textContent();
expect(labelValue).toBeTruthy();
async function performTest(labelValue: string) {
// Confirm both cells are rendered before we proceed
const otherValue = labelValue === firstValue ? secondValue : firstValue;
await expect(table.getByText(labelValue), `"${labelValue}" is rendered`).toContainText(labelValue);
await expect(table.getByText(otherValue), `"${otherValue}" is rendered`).toContainText(otherValue);
const otherValueCell = await getCell(panel, 2, 1);
const otherValueLabel = await otherValueCell.textContent();
expect(otherValueLabel).toBeTruthy();
expect(otherValueLabel).not.toBe(labelValue);
// click the "Filter for value" button on the cell with the specified labelValue
await table.getByText(labelValue).hover();
table.getByText(labelValue).getByRole('button', { name: 'Filter for value' }).click();
// Hover over the first cell to trigger the appearance of filter actions
await labelValueCell.hover();
// Look for submenu items that contain the filtered value
// The adhoc filter should appear as a filter chip or within the variable controls
const submenuItems = dashboardPage.getByGrafanaSelector(selectors.pages.Dashboard.SubMenu.submenuItem);
await expect(submenuItems.filter({ hasText: labelValue }), `submenu contains "${labelValue}"`).toBeVisible();
await expect(
submenuItems.filter({ hasText: otherValue }),
`submenu does not contain "${otherValue}"`
).toBeHidden();
// Check if the "Filter for value" button appears on hover
const filterForValueButton = labelValueCell.getByRole('button', { name: 'Filter for value' });
await expect(filterForValueButton).toBeVisible();
// The URL parameter should contain the filter in format like: var-PromAdHoc=["columnName","=","value"]
const currentUrl = page.url();
const urlParams = new URLSearchParams(new URL(currentUrl).search);
const promAdHocParam = urlParams.get('var-PromAdHoc');
expect(promAdHocParam, `url contains "${labelValue}"`).toContain(labelValue);
expect(promAdHocParam, `url does not contain "${otherValue}"`).not.toContain(otherValue);
// Click on the "Filter for value" button
await filterForValueButton.click();
// finally, let's check that the table was updated and that the value was filtered out when the query was re-run
await expect(table.getByText(labelValue), `"${labelValue}" is still visible`).toHaveText(labelValue);
await expect(table.getByText(otherValue), `"${otherValue}" is filtered out`).toBeHidden();
// Check if the adhoc filter appears in the dashboard submenu
const submenuItems = dashboardPage.getByGrafanaSelector(selectors.pages.Dashboard.SubMenu.submenuItem);
await expect(submenuItems.first()).toBeVisible();
// Remove the adhoc filter by clicking the submenu item again
const filterChip = submenuItems.filter({ hasText: labelValue });
await filterChip.getByLabel(/Remove filter with key/).click();
await page.click('body', { position: { x: 0, y: 0 } }); // click outside to close the open menu from ad-hoc filters
// Look for submenu items that contain the filtered value
// The adhoc filter should appear as a filter chip or within the variable controls
const hasFilterValue = await submenuItems.filter({ hasText: labelValue! }).count();
expect(hasFilterValue).toBeGreaterThan(0);
// the "first" and "second" cells locators don't work here for some reason.
await expect(table.getByText(labelValue), `"${labelValue}" is still rendered`).toContainText(labelValue);
await expect(table.getByText(otherValue), `"${otherValue}" is rendered again`).toContainText(otherValue);
}
const hasOtherValue = await submenuItems.filter({ hasText: otherValueLabel! }).count();
expect(hasOtherValue).toBe(0);
// Check if the URL contains the var-PromAdHoc parameter with the filtered value
const currentUrl = page.url();
expect(currentUrl).toContain('var-PromAdHoc');
// The URL parameter should contain the filter in format like: var-PromAdHoc=["columnName","=","value"]
const urlParams = new URLSearchParams(new URL(currentUrl).search);
const promAdHocParam = urlParams.get('var-PromAdHoc');
expect(promAdHocParam).toBeTruthy();
expect(promAdHocParam).toContain(labelValue!);
expect(promAdHocParam).not.toContain(otherValueLabel!);
// finally, let's check that the table was updated and that the value was filtered out when the query was re-run
await expect(otherValueCell).toBeHidden();
await performTest(firstValue);
await performTest(secondValue);
});
}
);

View File

@@ -17,7 +17,7 @@ test.describe(
tag: ['@dashboards'],
},
() => {
test.fixme('Tests dashboard time zone scenarios', async ({ page, gotoDashboardPage, selectors }) => {
test('Tests dashboard time zone scenarios', async ({ page, gotoDashboardPage, selectors }) => {
const dashboardPage = await gotoDashboardPage({ uid: TIMEZONE_DASHBOARD_UID });
const fromTimeZone = 'UTC';
@@ -106,12 +106,18 @@ test.describe(
zone: 'Browser',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
const relativeTimeRow = dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
.first();
const timezoneRow = dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel in timezone'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
.first();
await expect(relativeTimeRow).toBeVisible();
// Today so far, still in Browser timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -119,19 +125,8 @@ test.describe(
to: 'now',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel in timezone'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
await expect(timezoneRow).toBeVisible();
// Test UTC timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -140,12 +135,7 @@ test.describe(
zone: 'Coordinated Universal Time',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
// Today so far, still in UTC timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -153,19 +143,8 @@ test.describe(
to: 'now',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel in timezone'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
await expect(timezoneRow).toBeVisible();
// Test Tokyo timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -174,12 +153,7 @@ test.describe(
zone: 'Asia/Tokyo',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
// Today so far, still in Tokyo timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -187,19 +161,8 @@ test.describe(
to: 'now',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel in timezone'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
await expect(timezoneRow).toBeVisible();
// Test LA timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -208,12 +171,7 @@ test.describe(
zone: 'America/Los Angeles',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
// Today so far, still in LA timezone
await setTimeRange(page, dashboardPage, selectors, {
@@ -221,19 +179,8 @@ test.describe(
to: 'now',
});
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel with relative time override'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(
dashboardPage
.getByGrafanaSelector(selectors.components.Panels.Panel.title('Panel in timezone'))
.locator('[role="row"]')
.filter({ hasText: '00:00:00' })
).toBeVisible();
await expect(relativeTimeRow).toBeVisible();
await expect(timezoneRow).toBeVisible();
});
}
);

View File

@@ -65,11 +65,11 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
await expect(getCellHeight(page, 1, longTextColIdx)).resolves.toBeLessThan(100);
// test that hover overflow works.
const loremIpsumCell = await getCell(page, 1, longTextColIdx);
const loremIpsumCell = getCell(page, 1, longTextColIdx);
await loremIpsumCell.scrollIntoViewIfNeeded();
await loremIpsumCell.hover();
await expect(getCellHeight(page, 1, longTextColIdx)).resolves.toBeGreaterThan(100);
await (await getCell(page, 1, longTextColIdx + 1)).hover();
await getCell(page, 1, longTextColIdx + 1).hover();
await expect(getCellHeight(page, 1, longTextColIdx)).resolves.toBeLessThan(100);
// enable cell inspect, confirm that hover no longer triggers.
@@ -140,15 +140,15 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
).toBeVisible();
// click the "State" column header to sort it.
const stateColumnHeader = await getCell(page, 0, 1);
const stateColumnHeader = getCell(page, 0, 1);
await stateColumnHeader.getByText('Info').click();
await expect(stateColumnHeader).toHaveAttribute('aria-sort', 'ascending');
expect(getCell(page, 1, 1)).resolves.toContainText('down'); // down or down fast
await expect(getCell(page, 1, 1)).toContainText('down'); // down or down fast
await stateColumnHeader.getByText('Info').click();
await expect(stateColumnHeader).toHaveAttribute('aria-sort', 'descending');
expect(getCell(page, 1, 1)).resolves.toContainText('up'); // up or up fast
await expect(getCell(page, 1, 1)).toContainText('up'); // up or up fast
await stateColumnHeader.getByText('Info').click();
await expect(stateColumnHeader).not.toHaveAttribute('aria-sort');
@@ -171,7 +171,7 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
const stateColumnHeader = page.getByRole('columnheader').nth(infoColumnIdx);
// get the first value in the "State" column, filter it out, then check that it went away.
const firstStateValue = (await (await getCell(page, 1, infoColumnIdx)).textContent())!;
const firstStateValue = (await getCell(page, 1, infoColumnIdx).textContent())!;
await stateColumnHeader.getByTestId(selectors.components.Panels.Visualization.TableNG.Filters.HeaderButton).click();
const filterContainer = dashboardPage.getByGrafanaSelector(
selectors.components.Panels.Visualization.TableNG.Filters.Container
@@ -188,7 +188,7 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
await expect(filterContainer).not.toBeVisible();
// did it actually filter out our value?
await expect(getCell(page, 1, infoColumnIdx)).resolves.not.toHaveText(firstStateValue);
await expect(getCell(page, 1, infoColumnIdx)).not.toHaveText(firstStateValue);
});
test('Tests pagination, row height adjustment', async ({ gotoDashboardPage, selectors, page }) => {
@@ -289,7 +289,7 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
const dataLinkColIdx = await getColumnIdx(page, 'Data Link');
// Info column has a single DataLink by default.
const infoCell = await getCell(page, 1, infoColumnIdx);
const infoCell = getCell(page, 1, infoColumnIdx);
await expect(infoCell.locator('a')).toBeVisible();
expect(infoCell.locator('a')).toHaveAttribute('href');
expect(infoCell.locator('a')).not.toHaveAttribute('aria-haspopup');
@@ -306,7 +306,7 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
continue;
}
const cell = await getCell(page, 1, colIdx);
const cell = getCell(page, 1, colIdx);
await expect(cell.locator('a')).toBeVisible();
expect(cell.locator('a')).toHaveAttribute('href');
expect(cell.locator('a')).not.toHaveAttribute('aria-haspopup', 'menu');
@@ -319,7 +319,7 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
// loop thru the columns, click the links, observe that the tooltip appears, and close the tooltip.
for (let colIdx = 0; colIdx < colCount; colIdx++) {
const cell = await getCell(page, 1, colIdx);
const cell = getCell(page, 1, colIdx);
if (colIdx === infoColumnIdx) {
// the Info column should still have its single link.
expect(cell.locator('a')).not.toHaveAttribute('aria-haspopup', 'menu');
@@ -433,7 +433,7 @@ test.describe('Panels test: Table - Kitchen Sink', { tag: ['@panels', '@table']
await filterContainer.getByTitle('up', { exact: true }).locator('label').click();
await filterContainer.getByRole('button', { name: 'Ok' }).click();
const cell = await getCell(page, 1, dataLinkColumnIdx);
const cell = getCell(page, 1, dataLinkColumnIdx);
await expect(cell).toBeVisible();
await expect(cell).toHaveCSS('text-decoration', /line-through/);

View File

@@ -1,6 +1,6 @@
import { Page, Locator } from '@playwright/test';
export const getCell = async (loc: Page | Locator, rowIdx: number, colIdx: number) =>
export const getCell = (loc: Page | Locator, rowIdx: number, colIdx: number) =>
loc
.getByRole('row')
.nth(rowIdx)
@@ -8,7 +8,7 @@ export const getCell = async (loc: Page | Locator, rowIdx: number, colIdx: numbe
.nth(colIdx);
export const getCellHeight = async (loc: Page | Locator, rowIdx: number, colIdx: number) => {
const cell = await getCell(loc, rowIdx, colIdx);
const cell = getCell(loc, rowIdx, colIdx);
return (await cell.boundingBox())?.height ?? 0;
};
@@ -18,7 +18,7 @@ export const getColumnIdx = async (loc: Page | Locator, columnName: string) => {
let result = -1;
const colCount = await loc.getByRole('columnheader').count();
for (let colIdx = 0; colIdx < colCount; colIdx++) {
const cell = await getCell(loc, 0, colIdx);
const cell = getCell(loc, 0, colIdx);
if ((await cell.textContent()) === columnName) {
result = colIdx;
break;

View File

@@ -322,8 +322,7 @@
"@react-aria/focus": "3.21.2",
"@react-aria/overlays": "3.30.0",
"@react-aria/utils": "3.31.0",
"@react-awesome-query-builder/core": "^6.7.0-alpha.0",
"@react-awesome-query-builder/ui": "^6.7.0-alpha.0",
"@react-awesome-query-builder/ui": "6.6.15",
"@reduxjs/toolkit": "2.9.0",
"@visx/event": "3.12.0",
"@visx/gradient": "3.12.0",

View File

@@ -435,6 +435,7 @@ export {
isStandardFieldProp,
type OptionDefaults,
} from './panel/getPanelOptionsWithDefaults';
export { type PanelDataSummary, getPanelDataSummary } from './panel/suggestions/getPanelDataSummary';
export { createFieldConfigRegistry } from './panel/registryFactories';
export { type QueryRunner, type QueryRunnerOptions } from './types/queryRunner';
export { type GroupingToMatrixTransformerOptions } from './transformations/transformers/groupingToMatrix';
@@ -651,7 +652,6 @@ export {
type AngularPanelMenuItem,
type PanelPluginDataSupport,
type VisualizationSuggestion,
type PanelDataSummary,
type VisualizationSuggestionsSupplier,
VizOrientation,
VisualizationSuggestionScore,

View File

@@ -0,0 +1,94 @@
import { createDataFrame } from '../../dataframe/processDataFrame';
import { FieldType } from '../../types/dataFrame';
import { getPanelDataSummary } from './getPanelDataSummary';
describe('getPanelDataSummary', () => {
describe('when called with no dataframes', () => {
it('should return summary with zero counts', () => {
const summary = getPanelDataSummary();
expect(summary.rowCountTotal).toBe(0);
expect(summary.rowCountMax).toBe(0);
expect(summary.fieldCount).toBe(0);
expect(summary.frameCount).toBe(0);
expect(summary.hasData).toBe(false);
expect(summary.fieldCountByType(FieldType.time)).toBe(0);
expect(summary.fieldCountByType(FieldType.number)).toBe(0);
expect(summary.fieldCountByType(FieldType.string)).toBe(0);
expect(summary.fieldCountByType(FieldType.boolean)).toBe(0);
expect(summary.hasFieldType(FieldType.time)).toBe(false);
expect(summary.hasFieldType(FieldType.number)).toBe(false);
expect(summary.hasFieldType(FieldType.string)).toBe(false);
expect(summary.hasFieldType(FieldType.boolean)).toBe(false);
});
});
describe('when called with a single dataframe', () => {
it('should return correct summary', () => {
const frames = [
createDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1, 2, 3] },
{ name: 'value', type: FieldType.number, values: [10, 20, 30] },
],
}),
];
const summary = getPanelDataSummary(frames);
expect(summary.rowCountTotal).toBe(3);
expect(summary.rowCountMax).toBe(3);
expect(summary.fieldCount).toBe(2);
expect(summary.frameCount).toBe(1);
expect(summary.hasData).toBe(true);
expect(summary.fieldCountByType(FieldType.time)).toBe(1);
expect(summary.fieldCountByType(FieldType.number)).toBe(1);
expect(summary.fieldCountByType(FieldType.string)).toBe(0);
expect(summary.fieldCountByType(FieldType.boolean)).toBe(0);
expect(summary.hasFieldType(FieldType.time)).toBe(true);
expect(summary.hasFieldType(FieldType.number)).toBe(true);
expect(summary.hasFieldType(FieldType.string)).toBe(false);
expect(summary.hasFieldType(FieldType.boolean)).toBe(false);
});
});
describe('when called with multiple dataframes', () => {
it('should return correct summary', () => {
const frames = [
createDataFrame({
fields: [
{ name: 'time', type: FieldType.time, values: [1, 2, 3] },
{ name: 'value', type: FieldType.number, values: [10, 20, 30] },
],
}),
createDataFrame({
fields: [
{ name: 'category', type: FieldType.string, values: ['A', 'B'] },
{ name: 'amount', type: FieldType.number, values: [100, 200] },
],
}),
];
const summary = getPanelDataSummary(frames);
expect(summary.rowCountTotal).toBe(5);
expect(summary.rowCountMax).toBe(3);
expect(summary.fieldCount).toBe(4);
expect(summary.frameCount).toBe(2);
expect(summary.hasData).toBe(true);
expect(summary.fieldCountByType(FieldType.time)).toBe(1);
expect(summary.fieldCountByType(FieldType.number)).toBe(2);
expect(summary.fieldCountByType(FieldType.string)).toBe(1);
expect(summary.fieldCountByType(FieldType.boolean)).toBe(0);
expect(summary.hasFieldType(FieldType.time)).toBe(true);
expect(summary.hasFieldType(FieldType.number)).toBe(true);
expect(summary.hasFieldType(FieldType.string)).toBe(true);
expect(summary.hasFieldType(FieldType.boolean)).toBe(false);
});
});
});

View File

@@ -0,0 +1,82 @@
import { PreferredVisualisationType } from '../../types/data';
import { DataFrame, FieldType } from '../../types/dataFrame';
/**
* @alpha
*/
export interface PanelDataSummary {
hasData?: boolean;
rowCountTotal: number;
rowCountMax: number;
frameCount: number;
fieldCount: number;
fieldCountByType: (type: FieldType) => number;
hasFieldType: (type: FieldType) => boolean;
/** The first frame that sets this value */
preferredVisualisationType?: PreferredVisualisationType;
/* --- DEPRECATED FIELDS BELOW --- */
/** @deprecated use PanelDataSummary.fieldCountByType(FieldType.number) */
numberFieldCount: number;
/** @deprecated use PanelDataSummary.fieldCountByType(FieldType.time) */
timeFieldCount: number;
/** @deprecated use PanelDataSummary.fieldCountByType(FieldType.string) */
stringFieldCount: number;
/** @deprecated use PanelDataSummary.hasFieldType(FieldType.number) */
hasNumberField?: boolean;
/** @deprecated use PanelDataSummary.hasFieldType(FieldType.time) */
hasTimeField?: boolean;
/** @deprecated use PanelDataSummary.hasFieldType(FieldType.string) */
hasStringField?: boolean;
}
/**
* @alpha
* Given a list of dataframes, summarizes attributes of those frames for features like suggestions.
* @param frames - dataframes to summarize
* @returns summary of the dataframes
*/
export function getPanelDataSummary(frames: DataFrame[] = []): PanelDataSummary {
let rowCountTotal = 0;
let rowCountMax = 0;
let fieldCount = 0;
const countByType: Partial<Record<FieldType, number>> = {};
let preferredVisualisationType: PreferredVisualisationType | undefined;
for (const frame of frames) {
rowCountTotal += frame.length;
if (frame.meta?.preferredVisualisationType) {
preferredVisualisationType = frame.meta.preferredVisualisationType;
}
for (const field of frame.fields) {
fieldCount++;
countByType[field.type] = (countByType[field.type] || 0) + 1;
}
if (frame.length > rowCountMax) {
rowCountMax = frame.length;
}
}
const fieldCountByType = (f: FieldType) => countByType[f] ?? 0;
return {
rowCountTotal,
rowCountMax,
fieldCount,
preferredVisualisationType,
frameCount: frames.length,
hasData: rowCountTotal > 0,
hasFieldType: (f: FieldType) => fieldCountByType(f) > 0,
fieldCountByType,
// deprecated
numberFieldCount: fieldCountByType(FieldType.number),
timeFieldCount: fieldCountByType(FieldType.time),
stringFieldCount: fieldCountByType(FieldType.string),
hasTimeField: fieldCountByType(FieldType.time) > 0,
hasNumberField: fieldCountByType(FieldType.number) > 0,
hasStringField: fieldCountByType(FieldType.string) > 0,
};
}
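As a standalone illustration of the counting logic above, here is a minimal TypeScript sketch that reimplements the per-type tally. The `Field` and `Frame` interfaces here are simplified stand-ins for illustration, not the actual `@grafana/data` types:

```typescript
// Simplified frame shapes; stand-ins for the @grafana/data types.
type FieldType = 'time' | 'number' | 'string' | 'boolean';

interface Field { name: string; type: FieldType; values: unknown[] }
interface Frame { length: number; fields: Field[] }

function summarize(frames: Frame[]) {
  let rowCountTotal = 0;
  let rowCountMax = 0;
  let fieldCount = 0;
  const countByType: Partial<Record<FieldType, number>> = {};
  for (const frame of frames) {
    rowCountTotal += frame.length;
    rowCountMax = Math.max(rowCountMax, frame.length);
    for (const field of frame.fields) {
      fieldCount++;
      countByType[field.type] = (countByType[field.type] ?? 0) + 1;
    }
  }
  const fieldCountByType = (t: FieldType) => countByType[t] ?? 0;
  return {
    rowCountTotal,
    rowCountMax,
    fieldCount,
    frameCount: frames.length,
    hasData: rowCountTotal > 0,
    fieldCountByType,
    hasFieldType: (t: FieldType) => fieldCountByType(t) > 0,
  };
}

const s = summarize([
  { length: 3, fields: [
    { name: 'time', type: 'time', values: [1, 2, 3] },
    { name: 'value', type: 'number', values: [10, 20, 30] },
  ]},
  { length: 2, fields: [
    { name: 'category', type: 'string', values: ['A', 'B'] },
    { name: 'amount', type: 'number', values: [100, 200] },
  ]},
]);
console.log(s.rowCountTotal, s.rowCountMax, s.fieldCountByType('number')); // 5 3 2
```

Exposing `fieldCountByType`/`hasFieldType` as closures over one tally map is what lets the real function cover every `FieldType` without a dedicated counter per type.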

View File

@@ -248,11 +248,6 @@ export interface FeatureToggles {
*/
externalServiceAccounts?: boolean;
/**
* Enables panel monitoring through logs and measurements
* @default true
*/
panelMonitoring?: boolean;
/**
* Enables native HTTP Histograms
*/
enableNativeHTTPHistogram?: boolean;

View File

@@ -2,14 +2,15 @@ import { defaultsDeep } from 'lodash';
import { EventBus } from '../events/types';
import { StandardEditorProps } from '../field/standardFieldConfigEditorRegistry';
import { PanelDataSummary, getPanelDataSummary } from '../panel/suggestions/getPanelDataSummary';
import { Registry } from '../utils/Registry';
import { OptionsEditorItem } from './OptionsUIRegistryBuilder';
import { ScopedVars } from './ScopedVars';
import { AlertStateInfo } from './alerts';
import { PanelModel } from './dashboard';
import { LoadingState, PreferredVisualisationType } from './data';
import { DataFrame, FieldType } from './dataFrame';
import { LoadingState } from './data';
import { DataFrame } from './dataFrame';
import { DataQueryError, DataQueryRequest, DataQueryTimings } from './datasource';
import { FieldConfigSource } from './fieldOverrides';
import { IconName } from './icon';
@@ -258,25 +259,6 @@ export enum VisualizationSuggestionScore {
OK = 50,
}
/**
* @alpha
*/
export interface PanelDataSummary {
hasData?: boolean;
rowCountTotal: number;
rowCountMax: number;
frameCount: number;
fieldCount: number;
numberFieldCount: number;
timeFieldCount: number;
stringFieldCount: number;
hasNumberField?: boolean;
hasTimeField?: boolean;
hasStringField?: boolean;
/** The first frame that sets this value */
preferredVisualisationType?: PreferredVisualisationType;
}
/**
* @alpha
*/
@@ -293,68 +275,13 @@ export class VisualizationSuggestionsBuilder {
constructor(data?: PanelData, panel?: PanelModel) {
this.data = data;
this.panel = panel;
this.dataSummary = this.computeDataSummary();
this.dataSummary = getPanelDataSummary(this.data?.series);
}
getListAppender<TOptions, TFieldConfig>(defaults: VisualizationSuggestion<TOptions, TFieldConfig>) {
return new VisualizationSuggestionsListAppender<TOptions, TFieldConfig>(this.list, defaults);
}
private computeDataSummary() {
const frames = this.data?.series || [];
let numberFieldCount = 0;
let timeFieldCount = 0;
let stringFieldCount = 0;
let rowCountTotal = 0;
let rowCountMax = 0;
let fieldCount = 0;
let preferredVisualisationType: PreferredVisualisationType | undefined;
for (const frame of frames) {
rowCountTotal += frame.length;
if (frame.meta?.preferredVisualisationType) {
preferredVisualisationType = frame.meta.preferredVisualisationType;
}
for (const field of frame.fields) {
fieldCount++;
switch (field.type) {
case FieldType.number:
numberFieldCount += 1;
break;
case FieldType.time:
timeFieldCount += 1;
break;
case FieldType.string:
stringFieldCount += 1;
break;
}
}
if (frame.length > rowCountMax) {
rowCountMax = frame.length;
}
}
return {
numberFieldCount,
timeFieldCount,
stringFieldCount,
rowCountTotal,
rowCountMax,
fieldCount,
preferredVisualisationType,
frameCount: frames.length,
hasData: rowCountTotal > 0,
hasTimeField: timeFieldCount > 0,
hasNumberField: numberFieldCount > 0,
hasStringField: stringFieldCount > 0,
};
}
getList() {
return this.list;
}

View File

@@ -22,8 +22,7 @@
"@grafana/plugin-ui": "^0.10.10",
"@grafana/runtime": "12.4.0-pre",
"@grafana/ui": "12.4.0-pre",
"@react-awesome-query-builder/core": "^6.7.0-alpha.0",
"@react-awesome-query-builder/ui": "^6.7.0-alpha.0",
"@react-awesome-query-builder/ui": "6.6.15",
"immutable": "5.1.4",
"lodash": "4.17.21",
"react": "18.3.1",

View File

@@ -1,4 +1,3 @@
import React, { Suspense } from 'react';
import { useAsync } from 'react-use';
import { SelectableValue, TypedVariableModel } from '@grafana/data';
@@ -8,9 +7,8 @@ import { QueryWithDefaults } from '../../defaults';
import { DB, SQLExpression, SQLQuery, SQLSelectableValue } from '../../types';
import { useSqlChange } from '../../utils/useSqlChange';
import type { Config } from './AwesomeQueryBuilder';
const LazyWhereRow = React.lazy(() => import(/* webpackChunkName: "sql-editor-where-row" */ './WhereRow'));
import { Config } from './AwesomeQueryBuilder';
import { WhereRow } from './WhereRow';
interface WhereRowProps {
query: QueryWithDefaults;
@@ -27,21 +25,19 @@ export function SQLWhereRow({ query, fields, onQueryChange, db }: WhereRowProps)
const { onSqlChange } = useSqlChange({ query, onQueryChange, db });
return (
<Suspense>
<LazyWhereRow
// TODO: fix key that's used to force clean render or SQLWhereRow - otherwise it doesn't render operators correctly
key={JSON.stringify(state.value)}
config={{ fields: state.value || {} }}
sql={query.sql!}
onSqlChange={(val: SQLExpression) => {
const templateVars = getTemplateSrv().getVariables();
<WhereRow
// TODO: fix key that's used to force clean render of SQLWhereRow - otherwise it doesn't render operators correctly
key={JSON.stringify(state.value)}
config={{ fields: state.value || {} }}
sql={query.sql!}
onSqlChange={(val: SQLExpression) => {
const templateVars = getTemplateSrv().getVariables();
removeQuotesForMultiVariables(val, templateVars);
removeQuotesForMultiVariables(val, templateVars);
onSqlChange(val);
}}
/>
</Suspense>
onSqlChange(val);
}}
/>
);
}

View File

@@ -90,5 +90,3 @@ injectGlobal`
display: none;
}
`;
export default WhereRow;

View File

@@ -1,4 +1,4 @@
import type { JsonTree } from '@react-awesome-query-builder/ui';
import { JsonTree } from '@react-awesome-query-builder/ui';
import {
DataFrame,

View File

@@ -273,7 +273,7 @@ func setupSimpleHTTPServer(features featuremgmt.FeatureToggles) *HTTPServer {
AccessControl: acimpl.ProvideAccessControl(featuremgmt.WithFeatures()),
annotationsRepo: annotationstest.NewFakeAnnotationsRepo(),
authInfoService: &authinfotest.FakeService{
ExpectedLabels: map[int64]string{int64(1): login.GetAuthProviderLabel(login.LDAPAuthModule)},
ExpectedRecentlyUsedLabel: map[int64]string{int64(1): login.GetAuthProviderLabel(login.LDAPAuthModule)},
},
tracer: tracing.InitializeTracerForTest(),
}

View File

@@ -314,7 +314,7 @@ func (hs *HTTPServer) searchOrgUsersHelper(c *contextmodel.ReqContext, query *or
filteredUsers = append(filteredUsers, user)
}
modules, err := hs.authInfoService.GetUserLabels(c.Req.Context(), login.GetUserLabelsQuery{
modules, err := hs.authInfoService.GetUsersRecentlyUsedLabel(c.Req.Context(), login.GetUserLabelsQuery{
UserIDs: authLabelsUserIDs,
})

View File

@@ -115,6 +115,7 @@ func (hs *HTTPServer) GetUserByLoginOrEmail(c *contextmodel.ReqContext) response
}
return response.Error(http.StatusInternalServerError, "Failed to get user", err)
}
result := user.UserProfileDTO{
ID: usr.ID,
UID: usr.UID,
@@ -128,6 +129,11 @@ func (hs *HTTPServer) GetUserByLoginOrEmail(c *contextmodel.ReqContext) response
UpdatedAt: usr.Updated,
CreatedAt: usr.Created,
}
// Populate AuthLabels using all historically used auth modules ordered by most recent.
if modules, err := hs.authInfoService.GetUserAuthModuleLabels(c.Req.Context(), usr.ID); err == nil {
result.AuthLabels = modules
}
return response.JSON(http.StatusOK, &result)
}

View File

@@ -185,6 +185,44 @@ func TestIntegrationUserAPIEndpoint_userLoggedIn(t *testing.T) {
require.NoError(t, err)
}, mock)
// Multiple historical auth labels should appear ordered by recency
loggedInUserScenario(t, "When calling GET returns with multiple auth labels", "/api/users/lookup", "/api/users/lookup", func(sc *scenarioContext) {
createUserCmd := user.CreateUserCommand{
Email: "multi@test.com",
Name: "multi",
Login: "multi",
IsAdmin: true,
}
orgSvc, err := orgimpl.ProvideService(sqlStore, sc.cfg, quotatest.New(false, nil))
require.NoError(t, err)
userSvc, err := userimpl.ProvideService(
sqlStore, orgSvc, sc.cfg, nil, nil, tracing.InitializeTracerForTest(),
quotatest.New(false, nil), supportbundlestest.NewFakeBundleService(),
)
require.NoError(t, err)
usr, err := userSvc.Create(context.Background(), &createUserCmd)
require.Nil(t, err)
sc.handlerFunc = hs.GetUserByLoginOrEmail
userMock := usertest.NewUserServiceFake()
userMock.ExpectedUser = &user.User{ID: usr.ID, Email: usr.Email, Login: usr.Login, Name: usr.Name}
sc.userService = userMock
hs.userService = userMock
fakeAuth := &authinfotest.FakeService{ExpectedAuthModuleLabels: []string{login.GetAuthProviderLabel(login.OktaAuthModule), login.GetAuthProviderLabel(login.LDAPAuthModule), login.GetAuthProviderLabel(login.SAMLAuthModule)}}
hs.authInfoService = fakeAuth
sc.fakeReqWithParams("GET", sc.url, map[string]string{"loginOrEmail": usr.Email}).exec()
var resp user.UserProfileDTO
require.Equal(t, http.StatusOK, sc.resp.Code)
err = json.Unmarshal(sc.resp.Body.Bytes(), &resp)
require.NoError(t, err)
expected := []string{login.GetAuthProviderLabel(login.OktaAuthModule), login.GetAuthProviderLabel(login.LDAPAuthModule), login.GetAuthProviderLabel(login.SAMLAuthModule)}
require.Equal(t, expected, resp.AuthLabels)
}, mock)
loggedInUserScenario(t, "When calling GET on", "/api/users", "/api/users", func(sc *scenarioContext) {
userMock.ExpectedSearchUsers = mockResult

View File

@@ -3,10 +3,8 @@ package datasource
import (
"context"
"encoding/json"
"errors"
"fmt"
"maps"
"path/filepath"
"github.com/prometheus/client_golang/prometheus"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
@@ -23,7 +21,6 @@ import (
datasourceV0 "github.com/grafana/grafana/pkg/apis/datasource/v0alpha1"
queryV0 "github.com/grafana/grafana/pkg/apis/query/v0alpha1"
grafanaregistry "github.com/grafana/grafana/pkg/apiserver/registry/generic"
"github.com/grafana/grafana/pkg/configprovider"
"github.com/grafana/grafana/pkg/plugins"
"github.com/grafana/grafana/pkg/plugins/manager/sources"
"github.com/grafana/grafana/pkg/promlib/models"
@@ -31,7 +28,6 @@ import (
"github.com/grafana/grafana/pkg/services/accesscontrol"
"github.com/grafana/grafana/pkg/services/apiserver/builder"
"github.com/grafana/grafana/pkg/services/featuremgmt"
"github.com/grafana/grafana/pkg/setting"
"github.com/grafana/grafana/pkg/tsdb/grafana-testdata-datasource/kinds"
)
@@ -53,7 +49,6 @@ type DataSourceAPIBuilder struct {
}
func RegisterAPIService(
cfgProvider configprovider.ConfigProvider,
features featuremgmt.FeatureToggles,
apiRegistrar builder.APIRegistrar,
pluginClient plugins.Client, // access to everything
@@ -61,6 +56,7 @@ func RegisterAPIService(
contextProvider PluginContextWrapper,
accessControl accesscontrol.AccessControl,
reg prometheus.Registerer,
pluginSources sources.Registry,
) (*DataSourceAPIBuilder, error) {
// We want to expose just a limited set of plugins
//nolint:staticcheck // not yet migrated to OpenFeature
@@ -75,13 +71,9 @@ func RegisterAPIService(
var err error
var builder *DataSourceAPIBuilder
cfg, err := cfgProvider.Get(context.Background())
pluginJSONs, err := getDatasourcePlugins(pluginSources)
if err != nil {
return nil, err
}
pluginJSONs, err := getCorePlugins(cfg)
if err != nil {
return nil, err
return nil, fmt.Errorf("error getting list of datasource plugins: %w", err)
}
ids := []string{
@@ -299,21 +291,29 @@ func (b *DataSourceAPIBuilder) GetOpenAPIDefinitions() openapi.GetOpenAPIDefinit
}
}
func getCorePlugins(cfg *setting.Cfg) ([]plugins.JSONData, error) {
coreDataSourcesPath := filepath.Join(cfg.StaticRootPath, "app", "plugins", "datasource")
coreDataSourcesSrc := sources.NewLocalSource(
plugins.ClassCore,
[]string{coreDataSourcesPath},
)
func getDatasourcePlugins(pluginSources sources.Registry) ([]plugins.JSONData, error) {
var pluginJSONs []plugins.JSONData
res, err := coreDataSourcesSrc.Discover(context.Background())
if err != nil {
return nil, errors.New("failed to load core data source plugins")
}
// It's possible that the same plugin will be found in different sources.
// Registering the same plugin twice in the API would be a bug,
// so this map tracks the IDs we have already seen and lets us skip duplicates.
var uniquePlugins = map[string]bool{}
pluginJSONs := make([]plugins.JSONData, 0, len(res))
for _, p := range res {
pluginJSONs = append(pluginJSONs, p.Primary.JSONData)
for _, pluginSource := range pluginSources.List(context.Background()) {
res, err := pluginSource.Discover(context.Background())
if err != nil {
return nil, err
}
for _, p := range res {
if p.Primary.JSONData.Type == plugins.TypeDataSource {
if _, found := uniquePlugins[p.Primary.JSONData.ID]; found {
backend.Logger.Info("Found duplicate plugin when registering API groups", "pluginID", p.Primary.JSONData.ID)
continue
}
uniquePlugins[p.Primary.JSONData.ID] = true
pluginJSONs = append(pluginJSONs, p.Primary.JSONData)
}
}
}
return pluginJSONs, nil
}
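The dedup-and-filter idea in `getDatasourcePlugins` can be sketched language-neutrally; this TypeScript version (with a hypothetical `PluginJSON` shape standing in for `plugins.JSONData`) shows the set-of-seen-IDs pattern:

```typescript
// PluginJSON is a hypothetical stand-in for plugins.JSONData.
interface PluginJSON { id: string; type: string }

function collectDatasourcePlugins(sources: PluginJSON[][]): PluginJSON[] {
  const seen = new Set<string>();
  const out: PluginJSON[] = [];
  for (const source of sources) {
    for (const p of source) {
      if (p.type !== 'datasource') {
        continue; // only datasource plugins get API groups
      }
      if (seen.has(p.id)) {
        continue; // the same plugin can be discovered by several sources
      }
      seen.add(p.id);
      out.push(p);
    }
  }
  return out;
}
```

First-source-wins ordering means the iteration order of the registry determines which copy of a duplicated plugin is registered.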

View File

@@ -120,6 +120,14 @@ func validateOnUpdate(ctx context.Context,
return err
}
// Check that the folder being moved is not an ancestor of the target parent.
// This prevents circular references (e.g., moving A under B when B is already under A).
for _, ancestor := range info.Items {
if ancestor.Name == obj.Name {
return fmt.Errorf("cannot move folder under its own descendant: this would create a circular reference")
}
}
// if by moving a folder we exceed the max depth, return an error
if len(info.Items) > maxDepth+1 {
return folder.ErrMaximumDepthReached.Errorf("maximum folder depth reached")
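The ancestor check added above reduces to scanning the move target's parent chain for the folder being moved; a TypeScript sketch of that idea (`FolderInfo` mirrors the shape used in the tests, names otherwise hypothetical):

```typescript
interface FolderInfo { name: string; parent?: string }

// targetAncestors is the chain returned for the new parent, e.g.
// child -> parent -> root. If the moved folder appears anywhere in
// that chain, the move would place it under its own descendant.
function wouldCreateCycle(moved: string, targetAncestors: FolderInfo[]): boolean {
  return targetAncestors.some((a) => a.name === moved);
}
```

This works because the ancestor chain of the prospective parent already contains every folder the move could loop back through, so one linear scan suffices.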

View File

@@ -264,6 +264,71 @@ func TestValidateUpdate(t *testing.T) {
maxDepth: folder.MaxNestedFolderDepth,
expectedErr: "[folder.maximum-depth-reached]",
},
{
name: "error when moving folder under its own descendant (direct child)",
folder: &folders.Folder{
ObjectMeta: metav1.ObjectMeta{
Name: "parent",
Annotations: map[string]string{
utils.AnnoKeyFolder: "child",
},
},
Spec: folders.FolderSpec{
Title: "parent folder",
},
},
old: &folders.Folder{
ObjectMeta: metav1.ObjectMeta{
Name: "parent",
},
Spec: folders.FolderSpec{
Title: "parent folder",
},
},
// When querying parents of "child", we get the chain: child -> parent -> root
// This means "parent" is an ancestor of "child", so we can't move "parent" under "child"
parents: &folders.FolderInfoList{
Items: []folders.FolderInfo{
{Name: "child", Parent: "parent"},
{Name: "parent", Parent: folder.GeneralFolderUID},
{Name: folder.GeneralFolderUID},
},
},
expectedErr: "cannot move folder under its own descendant",
},
{
name: "error when moving folder under its grandchild",
folder: &folders.Folder{
ObjectMeta: metav1.ObjectMeta{
Name: "grandparent",
Annotations: map[string]string{
utils.AnnoKeyFolder: "grandchild",
},
},
Spec: folders.FolderSpec{
Title: "grandparent folder",
},
},
old: &folders.Folder{
ObjectMeta: metav1.ObjectMeta{
Name: "grandparent",
},
Spec: folders.FolderSpec{
Title: "grandparent folder",
},
},
// When querying parents of "grandchild", we get: grandchild -> child -> grandparent -> root
// This means "grandparent" is in the ancestry, so we can't move it under "grandchild"
parents: &folders.FolderInfoList{
Items: []folders.FolderInfo{
{Name: "grandchild", Parent: "child"},
{Name: "child", Parent: "grandparent"},
{Name: "grandparent", Parent: folder.GeneralFolderUID},
{Name: folder.GeneralFolderUID},
},
},
expectedErr: "cannot move folder under its own descendant",
},
}
for _, tt := range tests {

View File

@@ -35,6 +35,7 @@ import (
clientset "github.com/grafana/grafana/apps/provisioning/pkg/generated/clientset/versioned"
client "github.com/grafana/grafana/apps/provisioning/pkg/generated/clientset/versioned/typed/provisioning/v0alpha1"
informers "github.com/grafana/grafana/apps/provisioning/pkg/generated/informers/externalversions"
jobsvalidation "github.com/grafana/grafana/apps/provisioning/pkg/jobs"
"github.com/grafana/grafana/apps/provisioning/pkg/loki"
"github.com/grafana/grafana/apps/provisioning/pkg/repository"
"github.com/grafana/grafana/pkg/apimachinery/identity"
@@ -576,10 +577,10 @@ func (b *APIBuilder) Validate(ctx context.Context, a admission.Attributes, o adm
return nil
}
// FIXME: Do nothing for Jobs for now
_, ok = obj.(*provisioning.Job)
// Validate Jobs
job, ok := obj.(*provisioning.Job)
if ok {
return nil
return jobsvalidation.ValidateJob(job)
}
repo, err := b.asRepository(ctx, obj, a.GetOldObject())

View File

@@ -128,6 +128,10 @@ func convertToK8sResource(
return nil, fmt.Errorf("failed to get metadata: %w", err)
}
meta.SetFolder(rule.NamespaceUID)
// Keep metadata label in sync with folder annotation for downstream consumers
if rule.NamespaceUID != "" {
k8sRule.Labels[model.FolderLabelKey] = rule.NamespaceUID
}
if rule.UpdatedBy != nil {
meta.SetUpdatedBy(string(*rule.UpdatedBy))
k8sRule.SetUpdatedBy(string(*rule.UpdatedBy))

View File

@@ -76,6 +76,10 @@ func convertToK8sResource(
return nil, fmt.Errorf("failed to get metadata: %w", err)
}
meta.SetFolder(rule.NamespaceUID)
// Keep metadata label in sync with folder annotation for downstream consumers
if rule.NamespaceUID != "" {
k8sRule.Labels[model.FolderLabelKey] = rule.NamespaceUID
}
if rule.UpdatedBy != nil {
meta.SetUpdatedBy(string(*rule.UpdatedBy))
k8sRule.SetUpdatedBy(string(*rule.UpdatedBy))

View File

@@ -104,7 +104,7 @@ func (s *legacyStorage) Get(ctx context.Context, name string, _ *metav1.GetOptio
return obj, err
}
func (s *legacyStorage) Create(ctx context.Context, obj runtime.Object, _ rest.ValidateObjectFunc, _ *metav1.CreateOptions) (runtime.Object, error) {
func (s *legacyStorage) Create(ctx context.Context, obj runtime.Object, createValidation rest.ValidateObjectFunc, _ *metav1.CreateOptions) (runtime.Object, error) {
info, err := request.NamespaceInfoFrom(ctx, true)
if err != nil {
return nil, err
@@ -114,6 +114,11 @@ func (s *legacyStorage) Create(ctx context.Context, obj runtime.Object, _ rest.V
if err != nil {
return nil, err
}
if createValidation != nil {
if err := createValidation(ctx, obj); err != nil {
return nil, err
}
}
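Threading the previously ignored `createValidation` callback into `Create` follows a common storage-hook pattern: invoke the hook only when the caller supplied one, and reject the object before it is persisted. A minimal TypeScript sketch (names hypothetical):

```typescript
// Optional pre-create validation hook; mirrors the nil check above.
type ValidateFn<T> = (obj: T) => Error | null;

function create<T>(obj: T, createValidation?: ValidateFn<T>): T {
  if (createValidation) {
    const err = createValidation(obj);
    if (err !== null) {
      throw err; // reject before persisting
    }
  }
  // ...persist obj...
  return obj;
}
```

Keeping the hook optional preserves callers that pass nothing, which is why the original signature could discard the parameter without breaking compilation.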
p, ok := obj.(*model.RecordingRule)
if !ok {

View File

@@ -14,13 +14,17 @@ import (
"github.com/grafana/grafana/apps/alerting/rules/pkg/apis"
rulesApp "github.com/grafana/grafana/apps/alerting/rules/pkg/app"
rulesAppConfig "github.com/grafana/grafana/apps/alerting/rules/pkg/app/config"
"github.com/grafana/grafana/pkg/apimachinery/identity"
grafanarest "github.com/grafana/grafana/pkg/apiserver/rest"
"github.com/grafana/grafana/pkg/infra/log"
"github.com/grafana/grafana/pkg/registry/apps/alerting/rules/alertrule"
"github.com/grafana/grafana/pkg/registry/apps/alerting/rules/recordingrule"
"github.com/grafana/grafana/pkg/services/apiserver/appinstaller"
"github.com/grafana/grafana/pkg/services/apiserver/endpoints/request"
reqns "github.com/grafana/grafana/pkg/services/apiserver/endpoints/request"
"github.com/grafana/grafana/pkg/services/ngalert"
ngmodels "github.com/grafana/grafana/pkg/services/ngalert/models"
"github.com/grafana/grafana/pkg/services/ngalert/notifier"
"github.com/grafana/grafana/pkg/setting"
)
@@ -50,11 +54,66 @@ func RegisterAppInstaller(
ng: ng,
}
provider := simple.NewAppProvider(apis.LocalManifest(), nil, rulesApp.New)
appSpecificConfig := rulesAppConfig.RuntimeConfig{
// Validate folder existence using the folder service
FolderValidator: func(ctx context.Context, folderUID string) (bool, error) {
if folderUID == "" {
return false, nil
}
orgID, err := reqns.OrgIDForList(ctx)
user, _ := identity.GetRequester(ctx)
if (err != nil || orgID < 1) && user != nil {
orgID = user.GetOrgID()
}
if user == nil || orgID < 1 {
// If we can't resolve identity/org in this context, don't block creation based on existence
return true, nil
}
// Use the RuleStore to check namespace (folder) visibility
_, err = ng.Api.RuleStore.GetNamespaceByUID(ctx, folderUID, orgID, user)
if err != nil {
return false, nil
}
return true, nil
},
BaseEvaluationInterval: ng.Cfg.UnifiedAlerting.BaseInterval,
ReservedLabelKeys: ngmodels.LabelsUserCannotSpecify,
// Validate that the configured notification receiver exists in the Alertmanager config
NotificationSettingsValidator: func(ctx context.Context, receiver string) (bool, error) {
if receiver == "" {
return false, nil
}
orgID, err := reqns.OrgIDForList(ctx)
if err != nil || orgID < 1 {
if user, _ := identity.GetRequester(ctx); user != nil {
orgID = user.GetOrgID()
}
}
if orgID < 1 {
// Without org context, skip validation rather than block
return true, nil
}
provider := notifier.NewCachedNotificationSettingsValidationService(ng.Api.AlertingStore)
vd, err := provider.Validator(ctx, orgID)
if err != nil {
log.New("alerting.rules.app").Error("failed to create notification settings validator", "error", err)
// If we cannot build a validator, don't block admission
return true, nil
}
// Only validate receiver presence; construct minimal settings
if err := vd.Validate(ngmodels.NotificationSettings{Receiver: receiver}); err != nil {
return false, nil
}
return true, nil
},
}
provider := simple.NewAppProvider(apis.LocalManifest(), appSpecificConfig, rulesApp.New)
appConfig := app.Config{
KubeConfig: restclient.Config{}, // this will be overridden by the installer's InitializeApp method
ManifestData: *apis.LocalManifest().ManifestData,
KubeConfig: restclient.Config{}, // this will be overridden by the installer's InitializeApp method
ManifestData: *apis.LocalManifest().ManifestData,
SpecificConfig: appSpecificConfig,
}
i, err := appsdkapiserver.NewDefaultAppInstaller(provider, appConfig, &apis.GoTypeAssociator{})
@@ -81,7 +140,7 @@ func (a *AlertingRulesAppInstaller) GetAuthorizer() authorizer.Authorizer {
}
func (a *AlertingRulesAppInstaller) GetLegacyStorage(gvr schema.GroupVersionResource) grafanarest.Storage {
namespacer := request.GetNamespaceMapper(a.cfg)
namespacer := reqns.GetNamespaceMapper(a.cfg)
switch gvr {
case recordingrule.ResourceInfo.GroupVersionResource():
return recordingrule.NewStorage(*a.ng.Api.AlertRules, namespacer)

View File

@@ -847,7 +847,7 @@ func Initialize(ctx context.Context, cfg *setting.Cfg, opts Options, apiOpts api
apiService := api4.ProvideService(cfg, routeRegisterImpl, accessControl, userService, authinfoimplService, ossGroups, identitySynchronizer, orgService, ldapImpl, userAuthTokenService, bundleregistryService)
dashboardsAPIBuilder := dashboard.RegisterAPIService(cfg, featureToggles, apiserverService, dashboardService, dashboardProvisioningService, service15, dashboardServiceImpl, dashboardPermissionsService, accessControl, accessClient, provisioningServiceImpl, dashboardsStore, registerer, sqlStore, tracingService, resourceClient, dualwriteService, sortService, quotaService, libraryPanelService, eventualRestConfigProvider, userService, libraryElementService, publicDashboardServiceImpl)
snapshotsAPIBuilder := dashboardsnapshot.RegisterAPIService(serviceImpl, apiserverService, cfg, featureToggles, sqlStore, registerer)
dataSourceAPIBuilder, err := datasource.RegisterAPIService(configProvider, featureToggles, apiserverService, middlewareHandler, scopedPluginDatasourceProvider, plugincontextProvider, accessControl, registerer)
dataSourceAPIBuilder, err := datasource.RegisterAPIService(featureToggles, apiserverService, middlewareHandler, scopedPluginDatasourceProvider, plugincontextProvider, accessControl, registerer, sourcesService)
if err != nil {
return nil, err
}
@@ -1485,7 +1485,7 @@ func InitializeForTest(ctx context.Context, t sqlutil.ITestDB, testingT interfac
apiService := api4.ProvideService(cfg, routeRegisterImpl, accessControl, userService, authinfoimplService, ossGroups, identitySynchronizer, orgService, ldapImpl, userAuthTokenService, bundleregistryService)
dashboardsAPIBuilder := dashboard.RegisterAPIService(cfg, featureToggles, apiserverService, dashboardService, dashboardProvisioningService, service15, dashboardServiceImpl, dashboardPermissionsService, accessControl, accessClient, provisioningServiceImpl, dashboardsStore, registerer, sqlStore, tracingService, resourceClient, dualwriteService, sortService, quotaService, libraryPanelService, eventualRestConfigProvider, userService, libraryElementService, publicDashboardServiceImpl)
snapshotsAPIBuilder := dashboardsnapshot.RegisterAPIService(serviceImpl, apiserverService, cfg, featureToggles, sqlStore, registerer)
dataSourceAPIBuilder, err := datasource.RegisterAPIService(configProvider, featureToggles, apiserverService, middlewareHandler, scopedPluginDatasourceProvider, plugincontextProvider, accessControl, registerer)
dataSourceAPIBuilder, err := datasource.RegisterAPIService(featureToggles, apiserverService, middlewareHandler, scopedPluginDatasourceProvider, plugincontextProvider, accessControl, registerer, sourcesService)
if err != nil {
return nil, err
}

View File

@@ -405,14 +405,6 @@ var (
Stage: FeatureStagePublicPreview,
Owner: identityAccessTeam,
},
{
Name: "panelMonitoring",
Description: "Enables panel monitoring through logs and measurements",
Stage: FeatureStageGeneralAvailability,
Expression: "true", // enabled by default
Owner: grafanaDatavizSquad,
FrontendOnly: true,
},
{
Name: "enableNativeHTTPHistogram",
Description: "Enables native HTTP Histograms",

View File

@@ -52,7 +52,6 @@ reportingRetries,preview,@grafana/grafana-operator-experience-squad,false,true,f
sseGroupByDatasource,experimental,@grafana/observability-metrics,false,false,false
lokiRunQueriesInParallel,privatePreview,@grafana/observability-logs,false,false,false
externalServiceAccounts,preview,@grafana/identity-access-team,false,false,false
panelMonitoring,GA,@grafana/dataviz-squad,false,false,true
enableNativeHTTPHistogram,experimental,@grafana/grafana-backend-services-squad,false,true,false
disableClassicHTTPHistogram,experimental,@grafana/grafana-backend-services-squad,false,true,false
formatString,GA,@grafana/dataviz-squad,false,false,true

View File

@@ -219,10 +219,6 @@ const (
// Automatic service account and token setup for plugins
FlagExternalServiceAccounts = "externalServiceAccounts"
// FlagPanelMonitoring
// Enables panel monitoring through logs and measurements
FlagPanelMonitoring = "panelMonitoring"
// FlagEnableNativeHTTPHistogram
// Enables native HTTP Histograms
FlagEnableNativeHTTPHistogram = "enableNativeHTTPHistogram"

View File

@@ -2905,7 +2905,8 @@
"metadata": {
"name": "panelMonitoring",
"resourceVersion": "1753448760331",
"creationTimestamp": "2023-10-09T05:19:08Z"
"creationTimestamp": "2023-10-09T05:19:08Z",
"deletionTimestamp": "2025-11-06T15:46:51Z"
},
"spec": {
"description": "Enables panel monitoring through logs and measurements",

View File

@@ -388,7 +388,9 @@ func (ss *FolderUnifiedStoreImpl) GetFolders(ctx context.Context, q folder.GetFo
}
if (q.WithFullpath || q.WithFullpathUIDs) && f.Fullpath == "" {
buildFolderFullPaths(f, relations, folderMap)
if err := buildFolderFullPaths(f, relations, folderMap); err != nil {
return nil, err
}
}
hits = append(hits, f)
@@ -559,15 +561,21 @@ func computeFullPath(parents []*folder.Folder) (string, string) {
return strings.Join(fullpath, "/"), strings.Join(fullpathUIDs, "/")
}
func buildFolderFullPaths(f *folder.Folder, relations map[string]string, folderMap map[string]*folder.Folder) {
func buildFolderFullPaths(f *folder.Folder, relations map[string]string, folderMap map[string]*folder.Folder) error {
titles := make([]string, 0)
uids := make([]string, 0)
titles = append(titles, f.Title)
uids = append(uids, f.UID)
i := 0
currentUID := f.UID
for currentUID != "" {
// This is just a circuit breaker to prevent infinite loops. We should never reach this limit.
if i > 1000 {
return fmt.Errorf("folder depth exceeds the maximum allowed depth; there might be a circular reference")
}
i++
parentUID, exists := relations[currentUID]
if !exists {
break
@@ -588,6 +596,7 @@ func buildFolderFullPaths(f *folder.Folder, relations map[string]string, folderM
f.Fullpath = strings.Join(util.Reverse(titles), "/")
f.FullpathUIDs = strings.Join(util.Reverse(uids), "/")
return nil
}
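The bounded walk in `buildFolderFullPaths` can be sketched in TypeScript (simplified: titles and relations as plain maps, names hypothetical). The fixed iteration cap converts a corrupted, cyclic relations map into an error instead of an infinite loop:

```typescript
function buildFullPath(
  uid: string,
  titles: Record<string, string>,
  relations: Record<string, string | undefined>
): string {
  const parts = [titles[uid] ?? uid];
  let current = uid;
  // Circuit breaker: a legitimate hierarchy never reaches this depth.
  for (let i = 0; i < 1000; i++) {
    const parent = relations[current];
    if (!parent) {
      // Reached a root; path was collected leaf-first, so reverse it.
      return parts.reverse().join('/');
    }
    parts.push(titles[parent] ?? parent);
    current = parent;
  }
  throw new Error('maximum folder depth exceeded; possible circular reference');
}
```

Returning an error (here, throwing) from the helper is what forces the caller-side change above, where `GetFolders` now propagates the failure instead of silently producing an empty path.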
func shouldSkipFolder(f *folder.Folder, filterUIDs map[string]struct{}) bool {

View File
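The hunk above adds a hop counter as a circuit breaker so a circular parent reference cannot loop forever. A standalone sketch of that traversal, under simplified assumptions (plain string maps instead of `*folder.Folder` values; the 1000-hop limit mirrors the diff):

```go
package main

import (
	"fmt"
	"strings"
)

// buildPath walks parent relations (child UID -> parent UID) upward,
// collecting titles. The hop counter is a circuit breaker: a circular
// reference trips it instead of looping forever.
func buildPath(uid string, relations map[string]string, titles map[string]string) (string, error) {
	parts := []string{titles[uid]}
	current := uid
	for i := 0; ; i++ {
		if i > 1000 {
			return "", fmt.Errorf("folder depth exceeds the maximum allowed depth; you might have a circular reference")
		}
		parent, ok := relations[current]
		if !ok || parent == "" {
			break
		}
		parts = append(parts, titles[parent])
		current = parent
	}
	// Reverse in place so the root folder comes first, as in computeFullPath.
	for l, r := 0, len(parts)-1; l < r; l, r = l+1, r-1 {
		parts[l], parts[r] = parts[r], parts[l]
	}
	return strings.Join(parts, "/"), nil
}

func main() {
	relations := map[string]string{"c": "b", "b": "a"}
	titles := map[string]string{"a": "Root", "b": "Mid", "c": "Leaf"}
	path, err := buildPath("c", relations, titles)
	fmt.Println(path, err) // Root/Mid/Leaf <nil>

	// A cycle trips the circuit breaker and surfaces as an error.
	_, err = buildPath("x", map[string]string{"x": "y", "y": "x"}, map[string]string{"x": "X", "y": "Y"})
	fmt.Println(err != nil) // true
}
```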

@@ -881,7 +881,7 @@ func TestBuildFolderFullPaths(t *testing.T) {
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
buildFolderFullPaths(tt.args.f, tt.args.relations, tt.args.folderMap)
require.NoError(t, buildFolderFullPaths(tt.args.f, tt.args.relations, tt.args.folderMap))
require.Equal(t, tt.want.Fullpath, tt.args.f.Fullpath, "BuildFolderFullPaths() = %v, want %v", tt.args.f.Fullpath, tt.want.Fullpath)
require.Equal(t, tt.want.FullpathUIDs, tt.args.f.FullpathUIDs, "BuildFolderFullPaths() = %v, want %v", tt.args.f.FullpathUIDs, tt.want.FullpathUIDs)
require.Equal(t, tt.want.Title, tt.args.f.Title, "BuildFolderFullPaths() = %v, want %v", tt.args.f.Title, tt.want.Title)

View File

@@ -8,15 +8,18 @@ import (
//go:generate mockery --name AuthInfoService --structname MockAuthInfoService --outpkg authinfotest --filename auth_info_service_mock.go --output ./authinfotest/
type AuthInfoService interface {
GetAuthInfo(ctx context.Context, query *GetAuthInfoQuery) (*UserAuth, error)
GetUserLabels(ctx context.Context, query GetUserLabelsQuery) (map[int64]string, error)
GetUsersRecentlyUsedLabel(ctx context.Context, query GetUserLabelsQuery) (map[int64]string, error)
GetUserAuthModuleLabels(ctx context.Context, userID int64) ([]string, error)
SetAuthInfo(ctx context.Context, cmd *SetAuthInfoCommand) error
UpdateAuthInfo(ctx context.Context, cmd *UpdateAuthInfoCommand) error
DeleteUserAuthInfo(ctx context.Context, userID int64) error
}
//go:generate mockery --name Store --structname MockAuthInfoStore --outpkg authinfotest --filename auth_info_store_mock.go --output ./authinfotest/
type Store interface {
GetAuthInfo(ctx context.Context, query *GetAuthInfoQuery) (*UserAuth, error)
GetUserLabels(ctx context.Context, query GetUserLabelsQuery) (map[int64]string, error)
GetUsersRecentlyUsedLabel(ctx context.Context, query GetUserLabelsQuery) (map[int64]string, error)
GetUserAuthModules(ctx context.Context, userID int64) ([]string, error)
SetAuthInfo(ctx context.Context, cmd *SetAuthInfoCommand) error
UpdateAuthInfo(ctx context.Context, cmd *UpdateAuthInfoCommand) error
DeleteUserAuthInfo(ctx context.Context, userID int64) error

View File

@@ -67,11 +67,28 @@ func (s *Service) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQuery
return authInfo, nil
}
func (s *Service) GetUserLabels(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
// GetUserAuthModuleLabels returns labels for all auth modules a user has used, ordered by most recently used first.
func (s *Service) GetUserAuthModuleLabels(ctx context.Context, userID int64) ([]string, error) {
modules, err := s.authInfoStore.GetUserAuthModules(ctx, userID)
if err != nil {
return nil, err
}
result := make([]string, 0, len(modules))
// modules should be unique and should not contain empty strings
for _, m := range modules {
label := login.GetAuthProviderLabel(m)
result = append(result, label)
}
return result, nil
}
func (s *Service) GetUsersRecentlyUsedLabel(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
if len(query.UserIDs) == 0 {
return map[int64]string{}, nil
}
return s.authInfoStore.GetUserLabels(ctx, query)
return s.authInfoStore.GetUsersRecentlyUsedLabel(ctx, query)
}
func (s *Service) setAuthInfoInCache(ctx context.Context, query *login.GetAuthInfoQuery, info *login.UserAuth) error {

View File
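The new service method maps each store-returned module ID to a display label while preserving the most-recently-used-first order. A minimal sketch of that mapping; the module IDs and label strings here are illustrative stand-ins, not the real values from the `login` package:

```go
package main

import "fmt"

// getAuthProviderLabel is a stand-in for login.GetAuthProviderLabel:
// it maps an auth module ID to a display label and falls back to the
// raw ID for unknown modules. The mappings are illustrative only.
func getAuthProviderLabel(module string) string {
	labels := map[string]string{
		"oauth_okta": "Okta",
		"ldap":       "LDAP",
		"saml":       "SAML",
	}
	if l, ok := labels[module]; ok {
		return l
	}
	return module
}

// toLabels mirrors the loop in GetUserAuthModuleLabels: map every
// module to its label without reordering, so the store's
// most-recently-used-first ordering survives.
func toLabels(modules []string) []string {
	out := make([]string, 0, len(modules))
	for _, m := range modules {
		out = append(out, getAuthProviderLabel(m))
	}
	return out
}

func main() {
	fmt.Println(toLabels([]string{"oauth_okta", "ldap", "saml"})) // [Okta LDAP SAML]
}
```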

@@ -0,0 +1,31 @@
package authinfoimpl
import (
"context"
"testing"
"github.com/grafana/grafana/pkg/services/login"
"github.com/grafana/grafana/pkg/services/login/authinfotest"
"github.com/stretchr/testify/mock"
"github.com/stretchr/testify/require"
)
func TestAuthInfoService_GetUserAuthModuleLabels(t *testing.T) {
store := authinfotest.NewMockAuthInfoStore(t)
userID := int64(42)
// Input modules from store (order matters, uniqueness assumed)
modules := []string{login.OktaAuthModule, login.LDAPAuthModule, login.SAMLAuthModule}
store.On("GetUserAuthModules", mock.Anything, userID).Return(modules, nil)
svc := ProvideService(store, nil, nil)
actual, err := svc.GetUserAuthModuleLabels(context.Background(), userID)
require.NoError(t, err)
expected := []string{login.GetAuthProviderLabel(login.OktaAuthModule), login.GetAuthProviderLabel(login.LDAPAuthModule), login.GetAuthProviderLabel(login.SAMLAuthModule)}
// Verify labels mapped and order preserved
require.Equal(t, expected, actual)
}

View File

@@ -82,7 +82,7 @@ func (s *Store) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQuery)
return userAuth, nil
}
func (s *Store) GetUserLabels(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
func (s *Store) GetUsersRecentlyUsedLabel(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
userAuths := []login.UserAuth{}
params := make([]interface{}, 0, len(query.UserIDs))
for _, id := range query.UserIDs {
@@ -105,6 +105,29 @@ func (s *Store) GetUserLabels(ctx context.Context, query login.GetUserLabelsQuer
return labelMap, nil
}
// GetUserAuthModules returns all auth modules a user has used, ordered by most recently used first.
func (s *Store) GetUserAuthModules(ctx context.Context, userID int64) ([]string, error) {
rows := make([]struct {
AuthModule string `xorm:"auth_module"`
}, 0)
err := s.sqlStore.WithDbSession(ctx, func(sess *db.Session) error {
return sess.Table("user_auth").Where("user_id = ?", userID).Desc("created").Cols("auth_module").Find(&rows)
})
if err != nil {
return nil, err
}
modules := make([]string, 0, len(rows))
seen := make(map[string]struct{}, len(rows))
for _, r := range rows {
if _, ok := seen[r.AuthModule]; ok {
continue
}
seen[r.AuthModule] = struct{}{}
modules = append(modules, r.AuthModule)
}
return modules, nil
}
func (s *Store) SetAuthInfo(ctx context.Context, cmd *login.SetAuthInfoCommand) error {
authUser := &login.UserAuth{
UserId: cmd.UserId,

View File
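The store method above fetches `user_auth` rows sorted by `created DESC` and then deduplicates in application code: because rows arrive newest first, keeping only the first occurrence of each module retains its most recent use. The dedup loop in isolation (module names are illustrative):

```go
package main

import "fmt"

// dedupOrdered mirrors the loop in GetUserAuthModules: the input is
// assumed to be sorted most-recent-first, so keeping the first
// occurrence of each value preserves recency while dropping repeats.
func dedupOrdered(rows []string) []string {
	out := make([]string, 0, len(rows))
	seen := make(map[string]struct{}, len(rows))
	for _, r := range rows {
		if _, ok := seen[r]; ok {
			continue
		}
		seen[r] = struct{}{}
		out = append(out, r)
	}
	return out
}

func main() {
	// e.g. the user signed in via SAML most recently, LDAP before that,
	// and SAML again earlier still: the older SAML row is dropped.
	fmt.Println(dedupOrdered([]string{"saml", "ldap", "saml", "oauth_okta"})) // [saml ldap oauth_okta]
}
```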

@@ -45,7 +45,7 @@ func TestIntegrationAuthInfoStore(t *testing.T) {
UserId: 2,
}))
labels, err := store.GetUserLabels(ctx, login.GetUserLabelsQuery{UserIDs: []int64{1, 2}})
labels, err := store.GetUsersRecentlyUsedLabel(ctx, login.GetUserLabelsQuery{UserIDs: []int64{1, 2}})
require.NoError(t, err)
require.Len(t, labels, 2)

View File

@@ -1,17 +1,163 @@
// Code generated by mockery; DO NOT EDIT.
// github.com/vektra/mockery
// template: testify
// Code generated by mockery v2.53.5. DO NOT EDIT.
package authinfotest
import (
"context"
context "context"
"github.com/grafana/grafana/pkg/services/login"
"github.com/grafana/grafana/pkg/services/user"
login "github.com/grafana/grafana/pkg/services/login"
mock "github.com/stretchr/testify/mock"
)
// MockAuthInfoService is an autogenerated mock type for the AuthInfoService type
type MockAuthInfoService struct {
mock.Mock
}
// DeleteUserAuthInfo provides a mock function with given fields: ctx, userID
func (_m *MockAuthInfoService) DeleteUserAuthInfo(ctx context.Context, userID int64) error {
ret := _m.Called(ctx, userID)
if len(ret) == 0 {
panic("no return value specified for DeleteUserAuthInfo")
}
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, int64) error); ok {
r0 = rf(ctx, userID)
} else {
r0 = ret.Error(0)
}
return r0
}
// GetAuthInfo provides a mock function with given fields: ctx, query
func (_m *MockAuthInfoService) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQuery) (*login.UserAuth, error) {
ret := _m.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetAuthInfo")
}
var r0 *login.UserAuth
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) (*login.UserAuth, error)); ok {
return rf(ctx, query)
}
if rf, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) *login.UserAuth); ok {
r0 = rf(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*login.UserAuth)
}
}
if rf, ok := ret.Get(1).(func(context.Context, *login.GetAuthInfoQuery) error); ok {
r1 = rf(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// GetUserAuthModuleLabels provides a mock function with given fields: ctx, userID
func (_m *MockAuthInfoService) GetUserAuthModuleLabels(ctx context.Context, userID int64) ([]string, error) {
ret := _m.Called(ctx, userID)
if len(ret) == 0 {
panic("no return value specified for GetUserAuthModuleLabels")
}
var r0 []string
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, int64) ([]string, error)); ok {
return rf(ctx, userID)
}
if rf, ok := ret.Get(0).(func(context.Context, int64) []string); ok {
r0 = rf(ctx, userID)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).([]string)
}
}
if rf, ok := ret.Get(1).(func(context.Context, int64) error); ok {
r1 = rf(ctx, userID)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// GetUsersRecentlyUsedLabel provides a mock function with given fields: ctx, query
func (_m *MockAuthInfoService) GetUsersRecentlyUsedLabel(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
ret := _m.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetUsersRecentlyUsedLabel")
}
var r0 map[int64]string
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) (map[int64]string, error)); ok {
return rf(ctx, query)
}
if rf, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) map[int64]string); ok {
r0 = rf(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(map[int64]string)
}
}
if rf, ok := ret.Get(1).(func(context.Context, login.GetUserLabelsQuery) error); ok {
r1 = rf(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// SetAuthInfo provides a mock function with given fields: ctx, cmd
func (_m *MockAuthInfoService) SetAuthInfo(ctx context.Context, cmd *login.SetAuthInfoCommand) error {
ret := _m.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for SetAuthInfo")
}
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, *login.SetAuthInfoCommand) error); ok {
r0 = rf(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// UpdateAuthInfo provides a mock function with given fields: ctx, cmd
func (_m *MockAuthInfoService) UpdateAuthInfo(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error {
ret := _m.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for UpdateAuthInfo")
}
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, *login.UpdateAuthInfoCommand) error); ok {
r0 = rf(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// NewMockAuthInfoService creates a new instance of MockAuthInfoService. It also registers a testing interface on the mock and a cleanup function to assert the mocks expectations.
// The first argument is typically a *testing.T value.
func NewMockAuthInfoService(t interface {
@@ -25,741 +171,3 @@ func NewMockAuthInfoService(t interface {
return mock
}
// MockAuthInfoService is an autogenerated mock type for the AuthInfoService type
type MockAuthInfoService struct {
mock.Mock
}
type MockAuthInfoService_Expecter struct {
mock *mock.Mock
}
func (_m *MockAuthInfoService) EXPECT() *MockAuthInfoService_Expecter {
return &MockAuthInfoService_Expecter{mock: &_m.Mock}
}
// DeleteUserAuthInfo provides a mock function for the type MockAuthInfoService
func (_mock *MockAuthInfoService) DeleteUserAuthInfo(ctx context.Context, userID int64) error {
ret := _mock.Called(ctx, userID)
if len(ret) == 0 {
panic("no return value specified for DeleteUserAuthInfo")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(context.Context, int64) error); ok {
r0 = returnFunc(ctx, userID)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockAuthInfoService_DeleteUserAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'DeleteUserAuthInfo'
type MockAuthInfoService_DeleteUserAuthInfo_Call struct {
*mock.Call
}
// DeleteUserAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - userID int64
func (_e *MockAuthInfoService_Expecter) DeleteUserAuthInfo(ctx interface{}, userID interface{}) *MockAuthInfoService_DeleteUserAuthInfo_Call {
return &MockAuthInfoService_DeleteUserAuthInfo_Call{Call: _e.mock.On("DeleteUserAuthInfo", ctx, userID)}
}
func (_c *MockAuthInfoService_DeleteUserAuthInfo_Call) Run(run func(ctx context.Context, userID int64)) *MockAuthInfoService_DeleteUserAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 int64
if args[1] != nil {
arg1 = args[1].(int64)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockAuthInfoService_DeleteUserAuthInfo_Call) Return(err error) *MockAuthInfoService_DeleteUserAuthInfo_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockAuthInfoService_DeleteUserAuthInfo_Call) RunAndReturn(run func(ctx context.Context, userID int64) error) *MockAuthInfoService_DeleteUserAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// GetAuthInfo provides a mock function for the type MockAuthInfoService
func (_mock *MockAuthInfoService) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQuery) (*login.UserAuth, error) {
ret := _mock.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetAuthInfo")
}
var r0 *login.UserAuth
var r1 error
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) (*login.UserAuth, error)); ok {
return returnFunc(ctx, query)
}
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) *login.UserAuth); ok {
r0 = returnFunc(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*login.UserAuth)
}
}
if returnFunc, ok := ret.Get(1).(func(context.Context, *login.GetAuthInfoQuery) error); ok {
r1 = returnFunc(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// MockAuthInfoService_GetAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'GetAuthInfo'
type MockAuthInfoService_GetAuthInfo_Call struct {
*mock.Call
}
// GetAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - query *login.GetAuthInfoQuery
func (_e *MockAuthInfoService_Expecter) GetAuthInfo(ctx interface{}, query interface{}) *MockAuthInfoService_GetAuthInfo_Call {
return &MockAuthInfoService_GetAuthInfo_Call{Call: _e.mock.On("GetAuthInfo", ctx, query)}
}
func (_c *MockAuthInfoService_GetAuthInfo_Call) Run(run func(ctx context.Context, query *login.GetAuthInfoQuery)) *MockAuthInfoService_GetAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 *login.GetAuthInfoQuery
if args[1] != nil {
arg1 = args[1].(*login.GetAuthInfoQuery)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockAuthInfoService_GetAuthInfo_Call) Return(userAuth *login.UserAuth, err error) *MockAuthInfoService_GetAuthInfo_Call {
_c.Call.Return(userAuth, err)
return _c
}
func (_c *MockAuthInfoService_GetAuthInfo_Call) RunAndReturn(run func(ctx context.Context, query *login.GetAuthInfoQuery) (*login.UserAuth, error)) *MockAuthInfoService_GetAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// GetUserLabels provides a mock function for the type MockAuthInfoService
func (_mock *MockAuthInfoService) GetUserLabels(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
ret := _mock.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetUserLabels")
}
var r0 map[int64]string
var r1 error
if returnFunc, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) (map[int64]string, error)); ok {
return returnFunc(ctx, query)
}
if returnFunc, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) map[int64]string); ok {
r0 = returnFunc(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(map[int64]string)
}
}
if returnFunc, ok := ret.Get(1).(func(context.Context, login.GetUserLabelsQuery) error); ok {
r1 = returnFunc(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// MockAuthInfoService_GetUserLabels_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'GetUserLabels'
type MockAuthInfoService_GetUserLabels_Call struct {
*mock.Call
}
// GetUserLabels is a helper method to define mock.On call
// - ctx context.Context
// - query login.GetUserLabelsQuery
func (_e *MockAuthInfoService_Expecter) GetUserLabels(ctx interface{}, query interface{}) *MockAuthInfoService_GetUserLabels_Call {
return &MockAuthInfoService_GetUserLabels_Call{Call: _e.mock.On("GetUserLabels", ctx, query)}
}
func (_c *MockAuthInfoService_GetUserLabels_Call) Run(run func(ctx context.Context, query login.GetUserLabelsQuery)) *MockAuthInfoService_GetUserLabels_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 login.GetUserLabelsQuery
if args[1] != nil {
arg1 = args[1].(login.GetUserLabelsQuery)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockAuthInfoService_GetUserLabels_Call) Return(int64ToString map[int64]string, err error) *MockAuthInfoService_GetUserLabels_Call {
_c.Call.Return(int64ToString, err)
return _c
}
func (_c *MockAuthInfoService_GetUserLabels_Call) RunAndReturn(run func(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error)) *MockAuthInfoService_GetUserLabels_Call {
_c.Call.Return(run)
return _c
}
// SetAuthInfo provides a mock function for the type MockAuthInfoService
func (_mock *MockAuthInfoService) SetAuthInfo(ctx context.Context, cmd *login.SetAuthInfoCommand) error {
ret := _mock.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for SetAuthInfo")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.SetAuthInfoCommand) error); ok {
r0 = returnFunc(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockAuthInfoService_SetAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'SetAuthInfo'
type MockAuthInfoService_SetAuthInfo_Call struct {
*mock.Call
}
// SetAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - cmd *login.SetAuthInfoCommand
func (_e *MockAuthInfoService_Expecter) SetAuthInfo(ctx interface{}, cmd interface{}) *MockAuthInfoService_SetAuthInfo_Call {
return &MockAuthInfoService_SetAuthInfo_Call{Call: _e.mock.On("SetAuthInfo", ctx, cmd)}
}
func (_c *MockAuthInfoService_SetAuthInfo_Call) Run(run func(ctx context.Context, cmd *login.SetAuthInfoCommand)) *MockAuthInfoService_SetAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 *login.SetAuthInfoCommand
if args[1] != nil {
arg1 = args[1].(*login.SetAuthInfoCommand)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockAuthInfoService_SetAuthInfo_Call) Return(err error) *MockAuthInfoService_SetAuthInfo_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockAuthInfoService_SetAuthInfo_Call) RunAndReturn(run func(ctx context.Context, cmd *login.SetAuthInfoCommand) error) *MockAuthInfoService_SetAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// UpdateAuthInfo provides a mock function for the type MockAuthInfoService
func (_mock *MockAuthInfoService) UpdateAuthInfo(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error {
ret := _mock.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for UpdateAuthInfo")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.UpdateAuthInfoCommand) error); ok {
r0 = returnFunc(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockAuthInfoService_UpdateAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'UpdateAuthInfo'
type MockAuthInfoService_UpdateAuthInfo_Call struct {
*mock.Call
}
// UpdateAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - cmd *login.UpdateAuthInfoCommand
func (_e *MockAuthInfoService_Expecter) UpdateAuthInfo(ctx interface{}, cmd interface{}) *MockAuthInfoService_UpdateAuthInfo_Call {
return &MockAuthInfoService_UpdateAuthInfo_Call{Call: _e.mock.On("UpdateAuthInfo", ctx, cmd)}
}
func (_c *MockAuthInfoService_UpdateAuthInfo_Call) Run(run func(ctx context.Context, cmd *login.UpdateAuthInfoCommand)) *MockAuthInfoService_UpdateAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 *login.UpdateAuthInfoCommand
if args[1] != nil {
arg1 = args[1].(*login.UpdateAuthInfoCommand)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockAuthInfoService_UpdateAuthInfo_Call) Return(err error) *MockAuthInfoService_UpdateAuthInfo_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockAuthInfoService_UpdateAuthInfo_Call) RunAndReturn(run func(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error) *MockAuthInfoService_UpdateAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// NewMockStore creates a new instance of MockStore. It also registers a testing interface on the mock and a cleanup function to assert the mocks expectations.
// The first argument is typically a *testing.T value.
func NewMockStore(t interface {
mock.TestingT
Cleanup(func())
}) *MockStore {
mock := &MockStore{}
mock.Mock.Test(t)
t.Cleanup(func() { mock.AssertExpectations(t) })
return mock
}
// MockStore is an autogenerated mock type for the Store type
type MockStore struct {
mock.Mock
}
type MockStore_Expecter struct {
mock *mock.Mock
}
func (_m *MockStore) EXPECT() *MockStore_Expecter {
return &MockStore_Expecter{mock: &_m.Mock}
}
// DeleteUserAuthInfo provides a mock function for the type MockStore
func (_mock *MockStore) DeleteUserAuthInfo(ctx context.Context, userID int64) error {
ret := _mock.Called(ctx, userID)
if len(ret) == 0 {
panic("no return value specified for DeleteUserAuthInfo")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(context.Context, int64) error); ok {
r0 = returnFunc(ctx, userID)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockStore_DeleteUserAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'DeleteUserAuthInfo'
type MockStore_DeleteUserAuthInfo_Call struct {
*mock.Call
}
// DeleteUserAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - userID int64
func (_e *MockStore_Expecter) DeleteUserAuthInfo(ctx interface{}, userID interface{}) *MockStore_DeleteUserAuthInfo_Call {
return &MockStore_DeleteUserAuthInfo_Call{Call: _e.mock.On("DeleteUserAuthInfo", ctx, userID)}
}
func (_c *MockStore_DeleteUserAuthInfo_Call) Run(run func(ctx context.Context, userID int64)) *MockStore_DeleteUserAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 int64
if args[1] != nil {
arg1 = args[1].(int64)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockStore_DeleteUserAuthInfo_Call) Return(err error) *MockStore_DeleteUserAuthInfo_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockStore_DeleteUserAuthInfo_Call) RunAndReturn(run func(ctx context.Context, userID int64) error) *MockStore_DeleteUserAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// GetAuthInfo provides a mock function for the type MockStore
func (_mock *MockStore) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQuery) (*login.UserAuth, error) {
ret := _mock.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetAuthInfo")
}
var r0 *login.UserAuth
var r1 error
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) (*login.UserAuth, error)); ok {
return returnFunc(ctx, query)
}
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) *login.UserAuth); ok {
r0 = returnFunc(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*login.UserAuth)
}
}
if returnFunc, ok := ret.Get(1).(func(context.Context, *login.GetAuthInfoQuery) error); ok {
r1 = returnFunc(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// MockStore_GetAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'GetAuthInfo'
type MockStore_GetAuthInfo_Call struct {
*mock.Call
}
// GetAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - query *login.GetAuthInfoQuery
func (_e *MockStore_Expecter) GetAuthInfo(ctx interface{}, query interface{}) *MockStore_GetAuthInfo_Call {
return &MockStore_GetAuthInfo_Call{Call: _e.mock.On("GetAuthInfo", ctx, query)}
}
func (_c *MockStore_GetAuthInfo_Call) Run(run func(ctx context.Context, query *login.GetAuthInfoQuery)) *MockStore_GetAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 *login.GetAuthInfoQuery
if args[1] != nil {
arg1 = args[1].(*login.GetAuthInfoQuery)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockStore_GetAuthInfo_Call) Return(userAuth *login.UserAuth, err error) *MockStore_GetAuthInfo_Call {
_c.Call.Return(userAuth, err)
return _c
}
func (_c *MockStore_GetAuthInfo_Call) RunAndReturn(run func(ctx context.Context, query *login.GetAuthInfoQuery) (*login.UserAuth, error)) *MockStore_GetAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// GetUserLabels provides a mock function for the type MockStore
func (_mock *MockStore) GetUserLabels(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
ret := _mock.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetUserLabels")
}
var r0 map[int64]string
var r1 error
if returnFunc, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) (map[int64]string, error)); ok {
return returnFunc(ctx, query)
}
if returnFunc, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) map[int64]string); ok {
r0 = returnFunc(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(map[int64]string)
}
}
if returnFunc, ok := ret.Get(1).(func(context.Context, login.GetUserLabelsQuery) error); ok {
r1 = returnFunc(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// MockStore_GetUserLabels_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'GetUserLabels'
type MockStore_GetUserLabels_Call struct {
*mock.Call
}
// GetUserLabels is a helper method to define mock.On call
// - ctx context.Context
// - query login.GetUserLabelsQuery
func (_e *MockStore_Expecter) GetUserLabels(ctx interface{}, query interface{}) *MockStore_GetUserLabels_Call {
return &MockStore_GetUserLabels_Call{Call: _e.mock.On("GetUserLabels", ctx, query)}
}
func (_c *MockStore_GetUserLabels_Call) Run(run func(ctx context.Context, query login.GetUserLabelsQuery)) *MockStore_GetUserLabels_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 login.GetUserLabelsQuery
if args[1] != nil {
arg1 = args[1].(login.GetUserLabelsQuery)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockStore_GetUserLabels_Call) Return(int64ToString map[int64]string, err error) *MockStore_GetUserLabels_Call {
_c.Call.Return(int64ToString, err)
return _c
}
func (_c *MockStore_GetUserLabels_Call) RunAndReturn(run func(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error)) *MockStore_GetUserLabels_Call {
_c.Call.Return(run)
return _c
}
// SetAuthInfo provides a mock function for the type MockStore
func (_mock *MockStore) SetAuthInfo(ctx context.Context, cmd *login.SetAuthInfoCommand) error {
ret := _mock.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for SetAuthInfo")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.SetAuthInfoCommand) error); ok {
r0 = returnFunc(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockStore_SetAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'SetAuthInfo'
type MockStore_SetAuthInfo_Call struct {
*mock.Call
}
// SetAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - cmd *login.SetAuthInfoCommand
func (_e *MockStore_Expecter) SetAuthInfo(ctx interface{}, cmd interface{}) *MockStore_SetAuthInfo_Call {
return &MockStore_SetAuthInfo_Call{Call: _e.mock.On("SetAuthInfo", ctx, cmd)}
}
func (_c *MockStore_SetAuthInfo_Call) Run(run func(ctx context.Context, cmd *login.SetAuthInfoCommand)) *MockStore_SetAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 *login.SetAuthInfoCommand
if args[1] != nil {
arg1 = args[1].(*login.SetAuthInfoCommand)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockStore_SetAuthInfo_Call) Return(err error) *MockStore_SetAuthInfo_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockStore_SetAuthInfo_Call) RunAndReturn(run func(ctx context.Context, cmd *login.SetAuthInfoCommand) error) *MockStore_SetAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// UpdateAuthInfo provides a mock function for the type MockStore
func (_mock *MockStore) UpdateAuthInfo(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error {
ret := _mock.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for UpdateAuthInfo")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(context.Context, *login.UpdateAuthInfoCommand) error); ok {
r0 = returnFunc(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockStore_UpdateAuthInfo_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'UpdateAuthInfo'
type MockStore_UpdateAuthInfo_Call struct {
*mock.Call
}
// UpdateAuthInfo is a helper method to define mock.On call
// - ctx context.Context
// - cmd *login.UpdateAuthInfoCommand
func (_e *MockStore_Expecter) UpdateAuthInfo(ctx interface{}, cmd interface{}) *MockStore_UpdateAuthInfo_Call {
return &MockStore_UpdateAuthInfo_Call{Call: _e.mock.On("UpdateAuthInfo", ctx, cmd)}
}
func (_c *MockStore_UpdateAuthInfo_Call) Run(run func(ctx context.Context, cmd *login.UpdateAuthInfoCommand)) *MockStore_UpdateAuthInfo_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 context.Context
if args[0] != nil {
arg0 = args[0].(context.Context)
}
var arg1 *login.UpdateAuthInfoCommand
if args[1] != nil {
arg1 = args[1].(*login.UpdateAuthInfoCommand)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockStore_UpdateAuthInfo_Call) Return(err error) *MockStore_UpdateAuthInfo_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockStore_UpdateAuthInfo_Call) RunAndReturn(run func(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error) *MockStore_UpdateAuthInfo_Call {
_c.Call.Return(run)
return _c
}
// NewMockUserProtectionService creates a new instance of MockUserProtectionService. It also registers a testing interface on the mock and a cleanup function to assert the mocks expectations.
// The first argument is typically a *testing.T value.
func NewMockUserProtectionService(t interface {
mock.TestingT
Cleanup(func())
}) *MockUserProtectionService {
mock := &MockUserProtectionService{}
mock.Mock.Test(t)
t.Cleanup(func() { mock.AssertExpectations(t) })
return mock
}
// MockUserProtectionService is an autogenerated mock type for the UserProtectionService type
type MockUserProtectionService struct {
mock.Mock
}
type MockUserProtectionService_Expecter struct {
mock *mock.Mock
}
func (_m *MockUserProtectionService) EXPECT() *MockUserProtectionService_Expecter {
return &MockUserProtectionService_Expecter{mock: &_m.Mock}
}
// AllowUserMapping provides a mock function for the type MockUserProtectionService
func (_mock *MockUserProtectionService) AllowUserMapping(user1 *user.User, authModule string) error {
ret := _mock.Called(user1, authModule)
if len(ret) == 0 {
panic("no return value specified for AllowUserMapping")
}
var r0 error
if returnFunc, ok := ret.Get(0).(func(*user.User, string) error); ok {
r0 = returnFunc(user1, authModule)
} else {
r0 = ret.Error(0)
}
return r0
}
// MockUserProtectionService_AllowUserMapping_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'AllowUserMapping'
type MockUserProtectionService_AllowUserMapping_Call struct {
*mock.Call
}
// AllowUserMapping is a helper method to define mock.On call
// - user1 *user.User
// - authModule string
func (_e *MockUserProtectionService_Expecter) AllowUserMapping(user1 interface{}, authModule interface{}) *MockUserProtectionService_AllowUserMapping_Call {
return &MockUserProtectionService_AllowUserMapping_Call{Call: _e.mock.On("AllowUserMapping", user1, authModule)}
}
func (_c *MockUserProtectionService_AllowUserMapping_Call) Run(run func(user1 *user.User, authModule string)) *MockUserProtectionService_AllowUserMapping_Call {
_c.Call.Run(func(args mock.Arguments) {
var arg0 *user.User
if args[0] != nil {
arg0 = args[0].(*user.User)
}
var arg1 string
if args[1] != nil {
arg1 = args[1].(string)
}
run(
arg0,
arg1,
)
})
return _c
}
func (_c *MockUserProtectionService_AllowUserMapping_Call) Return(err error) *MockUserProtectionService_AllowUserMapping_Call {
_c.Call.Return(err)
return _c
}
func (_c *MockUserProtectionService_AllowUserMapping_Call) RunAndReturn(run func(user1 *user.User, authModule string) error) *MockUserProtectionService_AllowUserMapping_Call {
_c.Call.Return(run)
return _c
}


@@ -0,0 +1,173 @@
// Code generated by mockery v2.53.5. DO NOT EDIT.
package authinfotest
import (
context "context"
login "github.com/grafana/grafana/pkg/services/login"
mock "github.com/stretchr/testify/mock"
)
// MockAuthInfoStore is an autogenerated mock type for the Store type
type MockAuthInfoStore struct {
mock.Mock
}
// DeleteUserAuthInfo provides a mock function with given fields: ctx, userID
func (_m *MockAuthInfoStore) DeleteUserAuthInfo(ctx context.Context, userID int64) error {
ret := _m.Called(ctx, userID)
if len(ret) == 0 {
panic("no return value specified for DeleteUserAuthInfo")
}
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, int64) error); ok {
r0 = rf(ctx, userID)
} else {
r0 = ret.Error(0)
}
return r0
}
// GetAuthInfo provides a mock function with given fields: ctx, query
func (_m *MockAuthInfoStore) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQuery) (*login.UserAuth, error) {
ret := _m.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetAuthInfo")
}
var r0 *login.UserAuth
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) (*login.UserAuth, error)); ok {
return rf(ctx, query)
}
if rf, ok := ret.Get(0).(func(context.Context, *login.GetAuthInfoQuery) *login.UserAuth); ok {
r0 = rf(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*login.UserAuth)
}
}
if rf, ok := ret.Get(1).(func(context.Context, *login.GetAuthInfoQuery) error); ok {
r1 = rf(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// GetUserAuthModules provides a mock function with given fields: ctx, userID
func (_m *MockAuthInfoStore) GetUserAuthModules(ctx context.Context, userID int64) ([]string, error) {
ret := _m.Called(ctx, userID)
if len(ret) == 0 {
panic("no return value specified for GetUserAuthModules")
}
var r0 []string
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, int64) ([]string, error)); ok {
return rf(ctx, userID)
}
if rf, ok := ret.Get(0).(func(context.Context, int64) []string); ok {
r0 = rf(ctx, userID)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).([]string)
}
}
if rf, ok := ret.Get(1).(func(context.Context, int64) error); ok {
r1 = rf(ctx, userID)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// GetUsersRecentlyUsedLabel provides a mock function with given fields: ctx, query
func (_m *MockAuthInfoStore) GetUsersRecentlyUsedLabel(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
ret := _m.Called(ctx, query)
if len(ret) == 0 {
panic("no return value specified for GetUsersRecentlyUsedLabel")
}
var r0 map[int64]string
var r1 error
if rf, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) (map[int64]string, error)); ok {
return rf(ctx, query)
}
if rf, ok := ret.Get(0).(func(context.Context, login.GetUserLabelsQuery) map[int64]string); ok {
r0 = rf(ctx, query)
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(map[int64]string)
}
}
if rf, ok := ret.Get(1).(func(context.Context, login.GetUserLabelsQuery) error); ok {
r1 = rf(ctx, query)
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// SetAuthInfo provides a mock function with given fields: ctx, cmd
func (_m *MockAuthInfoStore) SetAuthInfo(ctx context.Context, cmd *login.SetAuthInfoCommand) error {
ret := _m.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for SetAuthInfo")
}
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, *login.SetAuthInfoCommand) error); ok {
r0 = rf(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// UpdateAuthInfo provides a mock function with given fields: ctx, cmd
func (_m *MockAuthInfoStore) UpdateAuthInfo(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error {
ret := _m.Called(ctx, cmd)
if len(ret) == 0 {
panic("no return value specified for UpdateAuthInfo")
}
var r0 error
if rf, ok := ret.Get(0).(func(context.Context, *login.UpdateAuthInfoCommand) error); ok {
r0 = rf(ctx, cmd)
} else {
r0 = ret.Error(0)
}
return r0
}
// NewMockAuthInfoStore creates a new instance of MockAuthInfoStore. It also registers a testing interface on the mock and a cleanup function to assert the mock's expectations.
// The first argument is typically a *testing.T value.
func NewMockAuthInfoStore(t interface {
mock.TestingT
Cleanup(func())
}) *MockAuthInfoStore {
mock := &MockAuthInfoStore{}
mock.Mock.Test(t)
t.Cleanup(func() { mock.AssertExpectations(t) })
return mock
}
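All of the generated methods above follow the same return-resolution convention: a value registered on the mock may be either a function matching the mocked method's signature (invoked per call) or a set of static values. A minimal stdlib-only sketch of that convention, using hypothetical names rather than the real mockery types:

```go
package main

import "fmt"

// call mimics how a mockery-generated mock stores configured return values:
// either a function with the mocked method's signature, or static values.
type call struct {
	returns []interface{}
}

// getUserAuthModules resolves return values the way MockAuthInfoStore does:
// prefer a registered function, fall back to the static values.
func (c *call) getUserAuthModules(userID int64) ([]string, error) {
	if rf, ok := c.returns[0].(func(int64) ([]string, error)); ok {
		return rf(userID)
	}
	mods, _ := c.returns[0].([]string)
	err, _ := c.returns[1].(error)
	return mods, err
}

var static = &call{returns: []interface{}{[]string{"oauth_github"}, nil}}

var dynamic = &call{returns: []interface{}{func(id int64) ([]string, error) {
	return []string{fmt.Sprintf("module-%d", id)}, nil
}}}

func main() {
	mods, _ := static.getUserAuthModules(1)
	fmt.Println(mods) // [oauth_github]
	mods, _ = dynamic.getUserAuthModules(42)
	fmt.Println(mods) // [module-42]
}
```

The dynamic form is what `RunAndReturn` feeds into the mock; the static form is what plain `Return` stores.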


@@ -8,11 +8,12 @@ import (
type FakeService struct {
login.AuthInfoService
LatestUserID int64
ExpectedUserAuth *login.UserAuth
ExpectedExternalUser *login.ExternalUserInfo
ExpectedError error
ExpectedLabels map[int64]string
LatestUserID int64
ExpectedUserAuth *login.UserAuth
ExpectedExternalUser *login.ExternalUserInfo
ExpectedError error
ExpectedRecentlyUsedLabel map[int64]string
ExpectedAuthModuleLabels []string
SetAuthInfoFn func(ctx context.Context, cmd *login.SetAuthInfoCommand) error
UpdateAuthInfoFn func(ctx context.Context, cmd *login.UpdateAuthInfoCommand) error
@@ -24,8 +25,12 @@ func (a *FakeService) GetAuthInfo(ctx context.Context, query *login.GetAuthInfoQ
return a.ExpectedUserAuth, a.ExpectedError
}
func (a *FakeService) GetUserLabels(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
return a.ExpectedLabels, a.ExpectedError
func (a *FakeService) GetUsersRecentlyUsedLabel(ctx context.Context, query login.GetUserLabelsQuery) (map[int64]string, error) {
return a.ExpectedRecentlyUsedLabel, a.ExpectedError
}
func (a *FakeService) GetUserAuthModuleLabels(ctx context.Context, userID int64) ([]string, error) {
return a.ExpectedAuthModuleLabels, a.ExpectedError
}
func (a *FakeService) SetAuthInfo(ctx context.Context, cmd *login.SetAuthInfoCommand) error {


@@ -461,7 +461,7 @@ func TestIntegrationCRUD(t *testing.T) {
}
created, err := adminClient.Create(ctx, alertRule, v1.CreateOptions{})
require.ErrorContains(t, err, "invalid alert rule")
require.ErrorContains(t, err, "trigger interval must be a multiple of base evaluation interval")
require.Nil(t, created)
})
}
@@ -564,3 +564,148 @@ func TestIntegrationBasicAPI(t *testing.T) {
t.Logf("Got error: %s", err)
})
}
func TestIntegrationFolderLabelSyncAndValidation(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
ctx := context.Background()
helper := common.GetTestHelper(t)
client := common.NewAlertRuleClient(t, helper.Org1.Admin)
// Prepare two folders for label sync update scenario
common.CreateTestFolder(t, helper, "test-folder-a")
common.CreateTestFolder(t, helper, "test-folder-b")
baseGen := ngmodels.RuleGen.With(
ngmodels.RuleMuts.WithUniqueUID(),
ngmodels.RuleMuts.WithUniqueTitle(),
ngmodels.RuleMuts.WithNamespaceUID("test-folder-a"),
ngmodels.RuleMuts.WithGroupName("test-group"),
ngmodels.RuleMuts.WithIntervalMatching(time.Duration(10)*time.Second),
)
t.Run("should keep folder label in sync with folder annotation on create and update", func(t *testing.T) {
rule := baseGen.Generate()
alertRule := &v0alpha1.AlertRule{
ObjectMeta: v1.ObjectMeta{
Namespace: "default",
Annotations: map[string]string{
v0alpha1.FolderAnnotationKey: "test-folder-a",
},
},
Spec: v0alpha1.AlertRuleSpec{
Title: rule.Title,
Expressions: v0alpha1.AlertRuleExpressionMap{
"A": {
QueryType: util.Pointer(rule.Data[0].QueryType),
DatasourceUID: util.Pointer(v0alpha1.AlertRuleDatasourceUID(rule.Data[0].DatasourceUID)),
Model: rule.Data[0].Model,
Source: util.Pointer(true),
RelativeTimeRange: &v0alpha1.AlertRuleRelativeTimeRange{
From: v0alpha1.AlertRulePromDurationWMillis("5m"),
To: v0alpha1.AlertRulePromDurationWMillis("0s"),
},
},
},
Trigger: v0alpha1.AlertRuleIntervalTrigger{
Interval: v0alpha1.AlertRulePromDuration(fmt.Sprintf("%ds", rule.IntervalSeconds)),
},
NoDataState: string(rule.NoDataState),
ExecErrState: string(rule.ExecErrState),
},
}
created, err := client.Create(ctx, alertRule, v1.CreateOptions{})
require.NoError(t, err)
defer func() { _ = client.Delete(ctx, created.Name, v1.DeleteOptions{}) }()
// On create, metadata.labels[v0alpha1.FolderLabelKey] should mirror annotation
require.Equal(t, "test-folder-a", created.Labels[v0alpha1.FolderLabelKey])
// Update annotation to point to a different folder and ensure label follows
updated := created.Copy().(*v0alpha1.AlertRule)
if updated.Annotations == nil {
updated.Annotations = map[string]string{}
}
updated.Annotations[v0alpha1.FolderAnnotationKey] = "test-folder-b"
after, err := client.Update(ctx, updated, v1.UpdateOptions{})
require.NoError(t, err)
require.Equal(t, "test-folder-b", after.Annotations[v0alpha1.FolderAnnotationKey])
require.Equal(t, "test-folder-b", after.Labels[v0alpha1.FolderLabelKey])
})
t.Run("should fail to create rule without folder annotation", func(t *testing.T) {
rule := baseGen.Generate()
alertRule := &v0alpha1.AlertRule{
ObjectMeta: v1.ObjectMeta{
Namespace: "default",
Annotations: map[string]string{}, // missing grafana.app/folder
},
Spec: v0alpha1.AlertRuleSpec{
Title: rule.Title,
Expressions: v0alpha1.AlertRuleExpressionMap{
"A": {
QueryType: util.Pointer(rule.Data[0].QueryType),
DatasourceUID: util.Pointer(v0alpha1.AlertRuleDatasourceUID(rule.Data[0].DatasourceUID)),
Model: rule.Data[0].Model,
Source: util.Pointer(true),
RelativeTimeRange: &v0alpha1.AlertRuleRelativeTimeRange{
From: v0alpha1.AlertRulePromDurationWMillis("5m"),
To: v0alpha1.AlertRulePromDurationWMillis("0s"),
},
},
},
Trigger: v0alpha1.AlertRuleIntervalTrigger{
Interval: v0alpha1.AlertRulePromDuration("10s"),
},
NoDataState: "NoData",
ExecErrState: "Error",
},
}
created, err := client.Create(ctx, alertRule, v1.CreateOptions{})
require.Error(t, err)
require.Nil(t, created)
})
t.Run("should fail to create rule with group labels preset", func(t *testing.T) {
rule := baseGen.Generate()
alertRule := &v0alpha1.AlertRule{
ObjectMeta: v1.ObjectMeta{
Namespace: "default",
Annotations: map[string]string{
v0alpha1.FolderAnnotationKey: "test-folder-a",
},
Labels: map[string]string{
v0alpha1.GroupLabelKey: "some-group",
v0alpha1.GroupIndexLabelKey: "0",
},
},
Spec: v0alpha1.AlertRuleSpec{
Title: rule.Title,
Expressions: v0alpha1.AlertRuleExpressionMap{
"A": {
QueryType: util.Pointer(rule.Data[0].QueryType),
DatasourceUID: util.Pointer(v0alpha1.AlertRuleDatasourceUID(rule.Data[0].DatasourceUID)),
Model: rule.Data[0].Model,
Source: util.Pointer(true),
RelativeTimeRange: &v0alpha1.AlertRuleRelativeTimeRange{
From: v0alpha1.AlertRulePromDurationWMillis("5m"),
To: v0alpha1.AlertRulePromDurationWMillis("0s"),
},
},
},
Trigger: v0alpha1.AlertRuleIntervalTrigger{Interval: v0alpha1.AlertRulePromDuration("10s")},
NoDataState: "NoData",
ExecErrState: "Error",
},
}
created, err := client.Create(ctx, alertRule, v1.CreateOptions{})
require.Error(t, err)
require.Nil(t, created)
})
}
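The sync behavior these tests exercise amounts to mirroring the folder annotation into a label during admission. A minimal sketch of that mutation, assuming map-based metadata and a hypothetical label key (the annotation key `grafana.app/folder` appears in the tests; the real constants live in the v0alpha1 package):

```go
package main

import "fmt"

// Hypothetical stand-ins for v0alpha1.FolderAnnotationKey / FolderLabelKey.
const (
	folderAnnotationKey = "grafana.app/folder"
	folderLabelKey      = "grafana.app/folder"
)

// syncFolderLabel mirrors the folder annotation into the labels map, as the
// admission mutation in these tests is expected to do on create and update.
func syncFolderLabel(annotations, labels map[string]string) map[string]string {
	folder, ok := annotations[folderAnnotationKey]
	if !ok || folder == "" {
		return labels // a missing annotation is rejected by validation instead
	}
	if labels == nil {
		labels = map[string]string{}
	}
	labels[folderLabelKey] = folder
	return labels
}

func main() {
	ann := map[string]string{folderAnnotationKey: "test-folder-b"}
	labels := syncFolderLabel(ann, map[string]string{folderLabelKey: "test-folder-a"})
	fmt.Println(labels[folderLabelKey]) // test-folder-b
}
```

This is only a sketch of the contract the assertions pin down, not the actual admission hook.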


@@ -454,7 +454,7 @@ func TestIntegrationCRUD(t *testing.T) {
}
created, err := adminClient.Create(ctx, recordingRule, v1.CreateOptions{})
require.ErrorContains(t, err, "invalid alert rule")
require.ErrorContains(t, err, "trigger interval must be a multiple of base evaluation interval")
require.Nil(t, created)
})
}
@@ -557,3 +557,139 @@ func TestIntegrationBasicAPI(t *testing.T) {
t.Logf("Got error: %s", err)
})
}
func TestIntegrationFolderLabelSyncAndValidation(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
ctx := context.Background()
helper := common.GetTestHelper(t)
client := common.NewRecordingRuleClient(t, helper.Org1.Admin)
// Prepare two folders for label sync update scenario
common.CreateTestFolder(t, helper, "test-folder-a")
common.CreateTestFolder(t, helper, "test-folder-b")
baseGen := ngmodels.RuleGen.With(
ngmodels.RuleMuts.WithUniqueUID(),
ngmodels.RuleMuts.WithUniqueTitle(),
ngmodels.RuleMuts.WithNamespaceUID("test-folder-a"),
ngmodels.RuleMuts.WithGroupName("test-group"),
ngmodels.RuleMuts.WithAllRecordingRules(),
ngmodels.RuleMuts.WithIntervalMatching(time.Duration(10)*time.Second),
)
t.Run("should keep folder label in sync with folder annotation on create and update", func(t *testing.T) {
rule := baseGen.Generate()
recordingRule := &v0alpha1.RecordingRule{
ObjectMeta: v1.ObjectMeta{
Namespace: "default",
Annotations: map[string]string{
v0alpha1.FolderAnnotationKey: "test-folder-a",
},
},
Spec: v0alpha1.RecordingRuleSpec{
Title: rule.Title,
Metric: rule.Record.Metric,
Expressions: v0alpha1.RecordingRuleExpressionMap{
"A": {
QueryType: util.Pointer(rule.Data[0].QueryType),
DatasourceUID: util.Pointer(v0alpha1.RecordingRuleDatasourceUID(rule.Data[0].DatasourceUID)),
Model: rule.Data[0].Model,
Source: util.Pointer(true),
RelativeTimeRange: &v0alpha1.RecordingRuleRelativeTimeRange{
From: v0alpha1.RecordingRulePromDurationWMillis("5m"),
To: v0alpha1.RecordingRulePromDurationWMillis("0s"),
},
},
},
Trigger: v0alpha1.RecordingRuleIntervalTrigger{Interval: v0alpha1.RecordingRulePromDuration("10s")},
},
}
created, err := client.Create(ctx, recordingRule, v1.CreateOptions{})
require.NoError(t, err)
defer func() { _ = client.Delete(ctx, created.Name, v1.DeleteOptions{}) }()
// On create, metadata.labels[v0alpha1.FolderLabelKey] should mirror annotation
require.Equal(t, "test-folder-a", created.Labels[v0alpha1.FolderLabelKey])
updated := created.Copy().(*v0alpha1.RecordingRule)
if updated.Annotations == nil {
updated.Annotations = map[string]string{}
}
updated.Annotations[v0alpha1.FolderAnnotationKey] = "test-folder-b"
after, err := client.Update(ctx, updated, v1.UpdateOptions{})
require.NoError(t, err)
require.Equal(t, "test-folder-b", after.Annotations[v0alpha1.FolderAnnotationKey])
require.Equal(t, "test-folder-b", after.Labels[v0alpha1.FolderLabelKey])
})
t.Run("should fail to create recording rule without folder annotation", func(t *testing.T) {
rule := baseGen.Generate()
recordingRule := &v0alpha1.RecordingRule{
ObjectMeta: v1.ObjectMeta{
Namespace: "default",
Annotations: map[string]string{},
},
Spec: v0alpha1.RecordingRuleSpec{
Title: rule.Title,
Metric: rule.Record.Metric,
Expressions: v0alpha1.RecordingRuleExpressionMap{
"A": {
QueryType: util.Pointer(rule.Data[0].QueryType),
DatasourceUID: util.Pointer(v0alpha1.RecordingRuleDatasourceUID(rule.Data[0].DatasourceUID)),
Model: rule.Data[0].Model,
Source: util.Pointer(true),
RelativeTimeRange: &v0alpha1.RecordingRuleRelativeTimeRange{
From: v0alpha1.RecordingRulePromDurationWMillis("5m"),
To: v0alpha1.RecordingRulePromDurationWMillis("0s"),
},
},
},
Trigger: v0alpha1.RecordingRuleIntervalTrigger{Interval: v0alpha1.RecordingRulePromDuration("10s")},
},
}
created, err := client.Create(ctx, recordingRule, v1.CreateOptions{})
require.Error(t, err)
require.Nil(t, created)
})
t.Run("should fail to create rule with group labels preset", func(t *testing.T) {
rule := baseGen.Generate()
recordingRule := &v0alpha1.RecordingRule{
ObjectMeta: v1.ObjectMeta{
Namespace: "default",
Annotations: map[string]string{
v0alpha1.FolderAnnotationKey: "test-folder-a",
},
Labels: map[string]string{
v0alpha1.GroupLabelKey: "some-group",
v0alpha1.GroupIndexLabelKey: "0",
},
},
Spec: v0alpha1.RecordingRuleSpec{
Title: rule.Title,
Metric: rule.Record.Metric,
Expressions: v0alpha1.RecordingRuleExpressionMap{
"A": {
QueryType: util.Pointer(rule.Data[0].QueryType),
DatasourceUID: util.Pointer(v0alpha1.RecordingRuleDatasourceUID(rule.Data[0].DatasourceUID)),
Model: rule.Data[0].Model,
Source: util.Pointer(true),
RelativeTimeRange: &v0alpha1.RecordingRuleRelativeTimeRange{
From: v0alpha1.RecordingRulePromDurationWMillis("5m"),
To: v0alpha1.RecordingRulePromDurationWMillis("0s"),
},
},
},
Trigger: v0alpha1.RecordingRuleIntervalTrigger{Interval: v0alpha1.RecordingRulePromDuration("10s")},
},
}
created, err := client.Create(ctx, recordingRule, v1.CreateOptions{})
require.Error(t, err)
require.Nil(t, created)
})
}


@@ -0,0 +1,182 @@
package provisioning
import (
"context"
"fmt"
"net/http"
"testing"
"github.com/stretchr/testify/require"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
provisioning "github.com/grafana/grafana/apps/provisioning/pkg/apis/provisioning/v0alpha1"
"github.com/grafana/grafana/pkg/util/testutil"
)
func TestIntegrationProvisioning_JobValidation(t *testing.T) {
testutil.SkipIntegrationTestInShortMode(t)
helper := runGrafana(t)
ctx := context.Background()
// Create a test repository first
const repo = "job-validation-test-repo"
testRepo := TestRepo{
Name: repo,
Target: "instance",
Copies: map[string]string{},
ExpectedDashboards: 0,
ExpectedFolders: 0,
}
helper.CreateRepo(t, testRepo)
tests := []struct {
name string
jobSpec map[string]interface{}
expectedErr string
}{
{
name: "job without action",
jobSpec: map[string]interface{}{
"repository": repo,
},
expectedErr: "spec.action: Required value: action must be specified",
},
{
name: "job with invalid action",
jobSpec: map[string]interface{}{
"action": "invalid-action",
"repository": repo,
},
expectedErr: "spec.action: Invalid value: \"invalid-action\": invalid action",
},
{
name: "pull job without pull options",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionPull),
"repository": repo,
},
expectedErr: "spec.pull: Required value: pull options required for pull action",
},
{
name: "push job without push options",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionPush),
"repository": repo,
},
expectedErr: "spec.push: Required value: push options required for push action",
},
{
name: "push job with invalid branch name",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionPush),
"repository": repo,
"push": map[string]interface{}{
"branch": "feature..branch", // Invalid: consecutive dots
"message": "Test commit",
},
},
expectedErr: "spec.push.branch: Invalid value: \"feature..branch\": invalid git branch name",
},
{
name: "push job with path traversal",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionPush),
"repository": repo,
"push": map[string]interface{}{
"path": "../../etc/passwd", // Invalid: path traversal
"message": "Test commit",
},
},
expectedErr: "spec.push.path: Invalid value: \"../../etc/passwd\"",
},
{
name: "delete job without paths or resources",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionDelete),
"repository": repo,
"delete": map[string]interface{}{},
},
expectedErr: "spec.delete: Required value: at least one path or resource must be specified",
},
{
name: "delete job with invalid path",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionDelete),
"repository": repo,
"delete": map[string]interface{}{
"paths": []string{"../invalid/path"},
},
},
expectedErr: "spec.delete.paths[0]: Invalid value: \"../invalid/path\"",
},
{
name: "move job without target path",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionMove),
"repository": repo,
"move": map[string]interface{}{
"paths": []string{"dashboard.json"},
},
},
expectedErr: "spec.move.targetPath: Required value: target path is required",
},
{
name: "move job without paths or resources",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionMove),
"repository": repo,
"move": map[string]interface{}{
"targetPath": "new-location/",
},
},
expectedErr: "spec.move: Required value: at least one path or resource must be specified",
},
{
name: "move job with invalid target path",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionMove),
"repository": repo,
"move": map[string]interface{}{
"paths": []string{"dashboard.json"},
"targetPath": "../../../etc/", // Invalid: path traversal
},
},
expectedErr: "spec.move.targetPath: Invalid value: \"../../../etc/\"",
},
{
name: "migrate job without migrate options",
jobSpec: map[string]interface{}{
"action": string(provisioning.JobActionMigrate),
"repository": repo,
},
expectedErr: "spec.migrate: Required value: migrate options required for migrate action",
},
}
for i, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
// Create the job object directly
jobObj := &unstructured.Unstructured{
Object: map[string]interface{}{
"apiVersion": "provisioning.grafana.app/v0alpha1",
"kind": "Job",
"metadata": map[string]interface{}{
"name": fmt.Sprintf("test-job-validation-%d", i),
"namespace": "default",
},
"spec": tt.jobSpec,
},
}
// Try to create the job - should fail with validation error
_, err := helper.Jobs.Resource.Create(ctx, jobObj, metav1.CreateOptions{})
require.Error(t, err, "expected validation error for invalid job spec")
// Verify it's a validation error with correct status code
statusError := helper.RequireApiErrorStatus(err, metav1.StatusReasonInvalid, http.StatusUnprocessableEntity)
require.Contains(t, statusError.Message, tt.expectedErr, "error message should contain expected validation message")
})
}
}
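The table above pins exact field-path error strings for each invalid spec. A stdlib-only sketch of action-specific validation in that style — the real admission hook lives in the provisioning app, so the type and function names here are hypothetical, but the messages match what the tests assert:

```go
package main

import (
	"errors"
	"fmt"
)

// jobSpec is a hypothetical, trimmed-down stand-in for the provisioning Job spec.
type jobSpec struct {
	Action string
	Pull   map[string]any
	Push   map[string]any
}

// validateJob mirrors the field-path error style the tests assert on:
// each action requires its matching options block.
func validateJob(s jobSpec) error {
	switch s.Action {
	case "":
		return errors.New("spec.action: Required value: action must be specified")
	case "pull":
		if s.Pull == nil {
			return errors.New("spec.pull: Required value: pull options required for pull action")
		}
	case "push":
		if s.Push == nil {
			return errors.New("spec.push: Required value: push options required for push action")
		}
	default:
		return fmt.Errorf("spec.action: Invalid value: %q: invalid action", s.Action)
	}
	return nil
}

func main() {
	fmt.Println(validateJob(jobSpec{}))
	fmt.Println(validateJob(jobSpec{Action: "pull"}))
	fmt.Println(validateJob(jobSpec{Action: "pull", Pull: map[string]any{}}))
}
```

Because these checks run in admission, a bad spec is rejected at create time with a 422 rather than surfacing later as a failed job.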


@@ -171,7 +171,7 @@ func TestIntegrationProvisioning_MoveJob(t *testing.T) {
})
t.Run("move without target path", func(t *testing.T) {
// Create move job without target path (should fail)
// Create move job without target path (should fail validation at creation time)
spec := provisioning.JobSpec{
Action: provisioning.JobActionMove,
Move: &provisioning.MoveJobOptions{
@@ -180,9 +180,20 @@ func TestIntegrationProvisioning_MoveJob(t *testing.T) {
},
}
job := helper.TriggerJobAndWaitForComplete(t, repo, spec)
state := mustNestedString(job.Object, "status", "state")
assert.Equal(t, "error", state, "move job should have failed due to missing target path")
// The job should be rejected by the admission controller with validation error
body := asJSON(&spec)
result := helper.AdminREST.Post().
Namespace("default").
Resource("repositories").
Name(repo).
SubResource("jobs").
Body(body).
SetHeader("Content-Type", "application/json").
Do(ctx)
require.Error(t, result.Error(), "move job without target path should fail validation")
statusError := helper.RequireApiErrorStatus(result.Error(), metav1.StatusReasonInvalid, 422)
require.Contains(t, statusError.Message, "spec.move.targetPath", "error should mention missing target path")
})
t.Run("move by resource reference", func(t *testing.T) {


@@ -1,4 +1,4 @@
import type { BooleanFieldSettings } from '@react-awesome-query-builder/ui';
import { BooleanFieldSettings } from '@react-awesome-query-builder/ui';
import {
FieldConfigPropertyItem,


@@ -44,7 +44,7 @@ describe('CommandPalette', () => {
// Check if empty state message is rendered
expect(await screen.findByText('No results found')).toBeInTheDocument();
// Check if AI Assistant button is rendered with correct props
expect(screen.getByRole('button', { name: 'Try searching with Grafana Assistant' })).toBeInTheDocument();
expect(screen.getByRole('button', { name: 'Search with Grafana Assistant' })).toBeInTheDocument();
});
it('should render empty state without AI Assistant button when assistant is not available', async () => {
@@ -55,6 +55,6 @@ describe('CommandPalette', () => {
// Check if empty state message is rendered
expect(await screen.findByText('No results found')).toBeInTheDocument();
// Check that AI Assistant button is not rendered
expect(screen.queryByRole('button', { name: 'Try searching with Grafana Assistant' })).not.toBeInTheDocument();
expect(screen.queryByRole('button', { name: 'Search with Grafana Assistant' })).not.toBeInTheDocument();
});
});


@@ -189,7 +189,7 @@ const RenderResults = ({ isFetchingSearchResults, searchResults, searchQuery }:
<OpenAssistantButton
origin="grafana/command-palette-empty-state"
prompt={`Search for ${searchQuery}`}
title={t('command-palette.empty-state.button-title', 'Try searching with Grafana Assistant')}
title={t('command-palette.empty-state.button-title', 'Search with Grafana Assistant')}
onClick={query.toggle}
/>
)}


@@ -33,7 +33,6 @@ import {
AdHocFilterItem,
} from '@grafana/ui';
import appEvents from 'app/core/app_events';
import config from 'app/core/config';
import { profiler } from 'app/core/profiler';
import { annotationServer } from 'app/features/annotations/api';
import { applyPanelTimeOverrides } from 'app/features/dashboard/utils/panel';
@@ -121,7 +120,7 @@ export class PanelStateWrapper extends PureComponent<Props, State> {
data: this.getInitialPanelDataState(),
};
if (config.featureToggles.panelMonitoring && this.getPanelContextApp() === CoreApp.PanelEditor) {
if (this.getPanelContextApp() === CoreApp.PanelEditor) {
const panelInfo = {
panelId: String(props.panel.id),
panelType: props.panel.type,
@@ -395,7 +394,7 @@ export class PanelStateWrapper extends PureComponent<Props, State> {
}
onPanelError = (error: Error) => {
if (config.featureToggles.panelMonitoring && this.getPanelContextApp() === CoreApp.PanelEditor) {
if (this.getPanelContextApp() === CoreApp.PanelEditor) {
this.logPanelChangesOnError();
}
@@ -543,7 +542,7 @@ export class PanelStateWrapper extends PureComponent<Props, State> {
onChangeTimeRange={this.onChangeTimeRange}
eventBus={dashboard.events}
/>
{config.featureToggles.panelMonitoring && this.state.errorMessage === undefined && (
{this.state.errorMessage === undefined && (
<PanelLoadTimeMonitor panelType={plugin.meta.id} panelId={panel.id} panelTitle={panel.title} />
)}
</PanelContextProvider>


@@ -1,12 +1,6 @@
import { css } from '@emotion/css';
import {
CoreApp,
GrafanaTheme2,
PanelDataSummary,
VisualizationSuggestionsBuilder,
VisualizationSuggestion,
} from '@grafana/data';
import { CoreApp, getPanelDataSummary, GrafanaTheme2, PanelDataSummary, VisualizationSuggestion } from '@grafana/data';
import { selectors } from '@grafana/e2e-selectors';
import { t, Trans } from '@grafana/i18n';
import { PanelDataErrorViewProps, locationService } from '@grafana/runtime';
@@ -27,8 +21,7 @@ import { changePanelPlugin } from '../state/actions';
export function PanelDataErrorView(props: PanelDataErrorViewProps) {
const styles = useStyles2(getStyles);
const context = usePanelContext();
const builder = new VisualizationSuggestionsBuilder(props.data);
const { dataSummary } = builder;
const dataSummary = getPanelDataSummary(props.data.series);
const message = getMessageFor(props, dataSummary);
const dispatch = useDispatch();


@@ -38,6 +38,7 @@ import { getHasTokenInstructions } from '../utils/git';
import { getRepositoryTypeConfig, isGitProvider } from '../utils/repositoryTypes';
import { ConfigFormGithubCollapse } from './ConfigFormGithubCollapse';
import { EnablePushToConfiguredBranchOption } from './EnablePushToConfiguredBranchOption';
import { getDefaultValues } from './defaults';
// This needs to be a function for translations to work
@@ -303,6 +304,7 @@ export function ConfigForm({ data }: ConfigFormProps) {
onChange: (e) => {
if (e.target.checked) {
setValue('prWorkflow', false);
setValue('enablePushToConfiguredBranch', false);
}
},
})}
@@ -324,6 +326,13 @@ export function ConfigForm({ data }: ConfigFormProps) {
/>
</Field>
)}
{isGitBased && (
<EnablePushToConfiguredBranchOption<RepositoryFormData>
register={register}
registerName="enablePushToConfiguredBranch"
readOnly={readOnly}
/>
)}
{type === 'github' && <ConfigFormGithubCollapse register={register} />}
{isGitBased && (


@@ -0,0 +1,28 @@
import { FieldValues, UseFormRegister, Path } from 'react-hook-form';
import { t } from '@grafana/i18n';
import { Checkbox, Field } from '@grafana/ui';
export function EnablePushToConfiguredBranchOption<T extends FieldValues>({
register,
registerName,
readOnly,
}: {
register: UseFormRegister<T>;
registerName: Path<T>;
readOnly: boolean;
}) {
return (
<Field noMargin>
<Checkbox
disabled={readOnly}
{...register(registerName)}
label={t('provisioning.enable-push-to-configured-branch-label', 'Enable push to configured branch')}
description={t(
'provisioning.enable-push-to-configured-branch-description',
'Allow direct commits to the configured branch.'
)}
/>
</Field>
);
}


@@ -31,6 +31,7 @@ export function getDefaultValues({
target: defaultTarget,
intervalSeconds: 60,
},
enablePushToConfiguredBranch: true,
};
}
return specToData(repository);


@@ -5,6 +5,7 @@ import { Trans, t } from '@grafana/i18n';
import { Checkbox, Field, Input, Stack, Text, TextLink } from '@grafana/ui';
import { useGetFrontendSettingsQuery } from 'app/api/clients/provisioning/v0alpha1';
import { EnablePushToConfiguredBranchOption } from '../Config/EnablePushToConfiguredBranchOption';
import { checkImageRenderer, checkImageRenderingAllowed, checkPublicAccess } from '../GettingStarted/features';
import { isGitProvider } from '../utils/repositoryTypes';
@@ -68,6 +69,7 @@ export const FinishStep = memo(function FinishStep() {
onChange: (e) => {
if (e.target.checked) {
setValue('repository.prWorkflow', false);
setValue('repository.enablePushToConfiguredBranch', false);
}
},
})}
@@ -90,6 +92,14 @@ export const FinishStep = memo(function FinishStep() {
</Field>
)}
{isGitBased && (
<EnablePushToConfiguredBranchOption<WizardFormData>
register={register}
readOnly={readOnly}
registerName="repository.enablePushToConfiguredBranch"
/>
)}
{isGithub && imageRenderingAllowed && (
<Field noMargin>
<Checkbox


@@ -44,6 +44,7 @@ export type RepositoryFormData = Omit<RepositorySpec, 'workflows' | RepositorySp
LocalRepositoryConfig & {
readOnly: boolean;
prWorkflow: boolean;
enablePushToConfiguredBranch: boolean;
// top-level inline secure value
token?: string;
};


@@ -6,7 +6,7 @@ export const getWorkflows = (data: RepositoryFormData): RepositorySpec['workflow
if (data.readOnly) {
return [];
}
const workflows: RepositorySpec['workflows'] = ['write'];
const workflows: RepositorySpec['workflows'] = data.enablePushToConfiguredBranch ? ['write'] : [];
if (!data.prWorkflow) {
return workflows;
@@ -69,6 +69,7 @@ export const specToData = (spec: RepositorySpec): RepositoryFormData => {
generateDashboardPreviews: spec.github?.generateDashboardPreviews || false,
readOnly: !spec.workflows.length,
prWorkflow: spec.workflows.includes('branch'),
enablePushToConfiguredBranch: spec.workflows.includes('write'),
});
};
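`getWorkflows` and `specToData` above form a round trip between the form flags and the `workflows` array: `enablePushToConfiguredBranch` maps to `'write'` and `prWorkflow` maps to `'branch'` (the hunk is truncated, but `specToData` reading `workflows.includes('branch')` implies the branch workflow is appended when `prWorkflow` is set). A sketch of that mapping in Go:

```go
package main

import "fmt"

// workflows mirrors the getWorkflows logic: readOnly yields no workflows,
// enablePushToConfiguredBranch adds "write", and prWorkflow adds "branch".
func workflows(readOnly, push, prWorkflow bool) []string {
	if readOnly {
		return nil
	}
	var ws []string
	if push {
		ws = append(ws, "write")
	}
	if prWorkflow {
		ws = append(ws, "branch")
	}
	return ws
}

func main() {
	fmt.Println(workflows(false, true, false)) // [write]
	fmt.Println(workflows(false, false, true)) // [branch]
	fmt.Println(workflows(true, true, true))   // []
}
```

This is why the new checkbox defaults to `true` in `getDefaultValues`: with it unchecked and no PR workflow, the repository would end up with an empty workflow list, i.e. effectively read-only.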


@@ -4130,7 +4130,7 @@
"scopes": "Scopes"
},
"empty-state": {
"button-title": "Try searching with Grafana Assistant",
"button-title": "Search with Grafana Assistant",
"message": "No results found"
},
"scopes": {
@@ -11510,6 +11510,8 @@
"empty-state": {
"no-jobs": "No jobs..."
},
"enable-push-to-configured-branch-description": "Allow direct commits to the configured branch.",
"enable-push-to-configured-branch-label": "Enable push to configured branch",
"enhanced-features": {
"description": "Get the most out of your GitHub integration with these optional add-ons",
"description-instant-updates": "Get instant updates in Grafana as soon as changes are committed. Review and approve changes using pull requests before they go live.",


@@ -3655,8 +3655,7 @@ __metadata:
     "@grafana/plugin-ui": "npm:^0.10.10"
     "@grafana/runtime": "npm:12.4.0-pre"
     "@grafana/ui": "npm:12.4.0-pre"
-    "@react-awesome-query-builder/core": "npm:^6.7.0-alpha.0"
-    "@react-awesome-query-builder/ui": "npm:^6.7.0-alpha.0"
+    "@react-awesome-query-builder/ui": "npm:6.6.15"
     "@testing-library/dom": "npm:10.4.1"
     "@testing-library/jest-dom": "npm:^6.1.2"
     "@testing-library/react": "npm:16.3.0"
@@ -6937,24 +6936,7 @@ __metadata:
   languageName: node
   linkType: hard
 
-"@react-awesome-query-builder/core@npm:^6.7.0-alpha.0":
-  version: 6.7.0-alpha.0
-  resolution: "@react-awesome-query-builder/core@npm:6.7.0-alpha.0"
-  dependencies:
-    "@babel/runtime": "npm:^7.27.0"
-    clone: "npm:^2.1.2"
-    i18next: "npm:^23.11.5"
-    immutable: "npm:^4.3.6"
-    json-logic-js: "npm:^2.0.2"
-    lodash: "npm:^4.17.21"
-    moment: "npm:^2.30.1"
-    spel2js: "npm:^0.2.8"
-    sqlstring: "npm:^2.3.3"
-  checksum: 10/2039cc283204567864a01ee49b44fae52e7ddc5f14cdf9e5e5f0728c483a7afa0d4c05d364a4e50413045d2dd0f5c713508dadc34e0f98ad371fcd5cfde86862
-  languageName: node
-  linkType: hard
-
-"@react-awesome-query-builder/ui@npm:^6.6.4":
+"@react-awesome-query-builder/ui@npm:6.6.15, @react-awesome-query-builder/ui@npm:^6.6.4":
   version: 6.6.15
   resolution: "@react-awesome-query-builder/ui@npm:6.6.15"
   dependencies:
@@ -6972,26 +6954,6 @@ __metadata:
   languageName: node
   linkType: hard
 
-"@react-awesome-query-builder/ui@npm:^6.7.0-alpha.0":
-  version: 6.7.0-alpha.0
-  resolution: "@react-awesome-query-builder/ui@npm:6.7.0-alpha.0"
-  dependencies:
-    "@babel/runtime": "npm:^7.27.0"
-    "@react-awesome-query-builder/core": "npm:^6.7.0-alpha.0"
-    chroma-js: "npm:^3.1.2"
-    classnames: "npm:^2.5.1"
-    lodash: "npm:^4.17.21"
-    prop-types: "npm:^15.8.1"
-    react-number-format: "npm:^5.0.0"
-    react-redux: "npm:^8.1.3"
-    redux: "npm:^4.2.1"
-  peerDependencies:
-    react: ^16.8.4 || ^17.0.1 || ^18.0.0 || ^19.0.0
-    react-dom: ^16.8.4 || ^17.0.1 || ^18.0.0 || ^19.0.0
-  checksum: 10/1f39a807672a94af219c58638f67b5cf9b0e43523c0c5718a12cba677e86d1b669c5715fe8920448139d35bc17b73b168cd581fd4d5bbbf0aaeec1780c58bceb
-  languageName: node
-  linkType: hard
-
 "@react-stately/flags@npm:^3.1.2":
   version: 3.1.2
   resolution: "@react-stately/flags@npm:3.1.2"
@@ -13448,13 +13410,6 @@ __metadata:
   languageName: node
   linkType: hard
 
-"chroma-js@npm:^3.1.2":
-  version: 3.1.2
-  resolution: "chroma-js@npm:3.1.2"
-  checksum: 10/ea09b27d04b477a8fdc3314bfa6dc8de05feab47999ff1acdeba6cf3c32c76338baa5b4562d949d33d8adeca57754e5ea591c15b9985d65583c8771e7e07c2ef
-  languageName: node
-  linkType: hard
-
 "chrome-remote-interface@npm:0.33.3":
   version: 0.33.3
   resolution: "chrome-remote-interface@npm:0.33.3"
@@ -18941,8 +18896,7 @@ __metadata:
     "@react-aria/focus": "npm:3.21.2"
     "@react-aria/overlays": "npm:3.30.0"
     "@react-aria/utils": "npm:3.31.0"
-    "@react-awesome-query-builder/core": "npm:^6.7.0-alpha.0"
-    "@react-awesome-query-builder/ui": "npm:^6.7.0-alpha.0"
+    "@react-awesome-query-builder/ui": "npm:6.6.15"
     "@react-types/button": "npm:3.13.0"
     "@react-types/menu": "npm:3.10.3"
     "@react-types/overlays": "npm:3.9.0"