Compare commits

..

36 Commits

Author SHA1 Message Date
Robby Milo
ca6d08d5cb Docs: Fix Broken Link (#22894)
(cherry picked from commit a61af9ed1d)
2020-03-20 14:13:09 +01:00
Leonard Gram
d01bdb517d release 6.7.1 2020-03-20 14:13:09 +01:00
Torkel Ödegaard
63dfdb7066 Panels: Fixed size issue with panels when existing panel edit mode (#22912)
(cherry picked from commit 8e131384e3)
2020-03-20 14:13:09 +01:00
Torkel Ödegaard
e95667fffb Azure: Fixed dropdowns not showing current value (#22914)
(cherry picked from commit d16211b782)
2020-03-20 14:13:09 +01:00
Hugo Häggmark
c08b901664 BackendSrv: only add content-type on POST, PUT requests (#22910)
* BackendSrv: only add content-type on POST, PUT requests
Fixes #22869

* Tests: imports polyfill for Headers

(cherry picked from commit 8d5c6053db)
2020-03-20 14:13:09 +01:00
Cyril Tovena
7cd6fef466 Check if the datasource is of type loki using meta.id instead of name. (#22877)
Signed-off-by: Cyril Tovena <cyril.tovena@gmail.com>
(cherry picked from commit ec9167e972)
2020-03-20 14:13:09 +01:00
Arve Knudsen
1b4f93b88c CircleCI: Pin grabpl to 0.1.0 (#22904) 2020-03-19 19:06:28 +01:00
Arve Knudsen
c4656a885d Release version 6.7.0
Signed-off-by: Arve Knudsen <arve.knudsen@gmail.com>
2020-03-19 12:27:02 +01:00
Ivana Huckova
818a2f3d64 Design tweaks (#22886)
(cherry picked from commit 8ba75e77b1)
2020-03-19 12:27:02 +01:00
Jess
7f52e023b5 Rich history UX fixes (#22783)
* Initial commit

* Visualised renamed or deleted  datasources as well, if they have queries

* Pass ds image to card and information if the datasource was removed/renamed

* Set up card with datasource info and change run query

* Style comment, run button

* Fix button naming

* Remember last filters

* Update public/app/core/store.ts

* Update public/app/features/explore/RichHistory/RichHistory.tsx

* Update comments

* Rename datasource to data source

* Add test coverage, fix naming

* Remove unused styles, add feedback info

Co-authored-by: Ivana <ivana.huckova@gmail.com>
Co-authored-by: Ivana Huckova <30407135+ivanahuckova@users.noreply.github.com>
(cherry picked from commit db85c3e7b9)
2020-03-19 12:27:02 +01:00
Daniel Lee
962a06545a AzureMonitor: support workspaces function for template variables (#22882)
* azuremonitor: adds support for workspaces query macro...

...for Azure Logs template variable queries

* docs: azure logs workspaces templating function

* Update docs/sources/features/datasources/azuremonitor.md

Co-Authored-By: Diana Payton <52059945+oddlittlebird@users.noreply.github.com>

* docs: convert list into table

* docs: fixes prettier formatting problem

Prettier adds a slash before dollar signs in markdown. Disabling it
for this table with a prettier comment.

https://prettier.io/docs/en/ignore.html

Co-authored-by: Diana Payton <52059945+oddlittlebird@users.noreply.github.com>
(cherry picked from commit 3b9a4e6444)
2020-03-19 12:27:02 +01:00
Arve Knudsen
79aeeaa10a SQLStore: Add migration for adding index on annotation.alert_id (#22876)
(cherry picked from commit bb05989e43)
2020-03-19 12:27:02 +01:00
Carl Bergquist
ea483c0ce1 Plugins: Return jsondetails as an json object instead of raw json on datasource healthchecks. (#22859)
(cherry picked from commit 579abad9cc)
2020-03-19 12:27:02 +01:00
Marcus Efraimsson
3e88197f96 Backend plugins: Exclude plugin metrics in Grafana's metrics endpoint (#22857)
Excludes backend plugin metrics in Grafana's metrics endpoint
Adds /api/:pluginId/metrics endpoint for retrieving metrics
from backend plugin as Prometheus text-based exposition format.

Fixes #22814

(cherry picked from commit 60e3437fc1)
2020-03-19 12:27:02 +01:00
Torkel Ödegaard
66df54db80 Graphite: Fixed issue with query editor and next select metric now showing after selecting metric node (#22856)
* Graphite: Fixed digest issue in graphite query editor

* Fixed unit test

* Updated

(cherry picked from commit aa4ed76a00)
2020-03-19 12:27:02 +01:00
Erik Sundell
e4b4480064 Stackdriver: Fix GCE auth bug when creating new data source (#22836)
* Fix test datasource for gce auth

* Cache gce default project locally

* Await gce default project call

* Remove reload functionality

* Fix build problem

(cherry picked from commit 1cd7ce24c7)
2020-03-19 12:27:02 +01:00
Marcus Efraimsson
4e4f69b5f6 @grafana/runtime: Add cancellation of queries to DataSourceWithBackend (#22818)
(cherry picked from commit eb96a8fcc8)
2020-03-19 12:27:02 +01:00
Ivana Huckova
a4b7209e39 Rich history: Test coverage (#22852)
* Add unit test coverage

* Add tests to util/richHistory

* Remove unused import

* Remove redundant tests

* Fix tests for components

* Test saving to local storage

* Add boxshadow to container

* Revert "Add boxshadow to container"

This reverts commit 5ca2e850e4.

* Fix failing tests after merging master

* Fix imports, aria-labels

* Remove console.log

(cherry picked from commit 8edf8e3982)
2020-03-19 12:27:02 +01:00
Carl Bergquist
6c001d9c09 Datasource config was not mapped for datasource healthcheck (#22848)
closes #22825

(cherry picked from commit 0a094a7319)
2020-03-19 12:27:02 +01:00
Carl Bergquist
312600aa2c upgrades plugin sdk to 0.30.0 (#22846)
ref grafana/grafana-plugin-sdk-go#94
ref grafana/grafana-plugin-sdk-go#70

(cherry picked from commit b0407b3578)
2020-03-19 12:27:02 +01:00
Ivana Huckova
26d701dcf9 Rich History: UX adjustments and fixes (#22729)
* Initial commit

* Fix spelling of data sources

* Display sorting value for starred and query tab

* Fix handle color for light theme

* Add close button and fix animation

* Remove toggling of tabs

* Stop event propagation when clicking on comment buttons

* Add title for card functionality

* Remove interpolation for easier searchability of variables

* Improve syncing of comments and starred

* Add modal to check if user wants to permanently delete history

* Fix the height of the query card buttons

* Adjust slider's width based on drawer width

* Add spacing between slider and legend

* Semantic variable naming

* Fix disabled button when live tailing

* Add error handling

* Remove unused imports

* Fix starring, remove useEffect

* Remove emiting of appEvents.alertError in store

* Remove unused imports

(cherry picked from commit 544690060a)
2020-03-19 12:27:02 +01:00
Dominik Prokop
e347b62cee TablePanel: Enable new units picker (#22833)
(cherry picked from commit 58298919c8)
2020-03-19 12:27:02 +01:00
Alex Khomenko
36232857df Fix dashboard picker's props (#22815)
(cherry picked from commit 2fac834413)
2020-03-19 12:27:02 +01:00
Alex Khomenko
9d605bdd04 Grafana-UI: Add invalid state to Forms.Textarea (#22775)
(cherry picked from commit cd50da3dbe)
2020-03-19 12:27:02 +01:00
Torkel Ödegaard
4d235b978e SaveDashboard: Updated modal design/layout a bit (#22810)
(cherry picked from commit 46165a7f7b)
2020-03-19 12:27:02 +01:00
Torkel Ödegaard
6575c9cb6e Forms: Fix input suffix position (#22780)
* Forms: Fix input suffix position

* Update

(cherry picked from commit ab0238eced)
2020-03-19 12:27:02 +01:00
Torkel Ödegaard
0ad27a6596 AngularPanels: Fixed inner height calculation (#22796)
(cherry picked from commit f78501f3b5)
2020-03-19 12:27:02 +01:00
Hugo Häggmark
a0c6afa0a5 Fix: fixes issue with headers property with different casing (#22778)
Fixes #22756

(cherry picked from commit b30f4c7bb0)
2020-03-19 12:27:02 +01:00
Ryan McKinley
3d0bc141c7 DataSourceWithBackend: use /health endpoint for test (#22789)
(cherry picked from commit 8b067a5fe0)
2020-03-19 12:27:02 +01:00
Ryan McKinley
ed307897e7 Chore: remove expressions flag and allow (#22764)
(cherry picked from commit c65db9bf25)
2020-03-19 12:27:02 +01:00
Alex Khomenko
1d63f57caf Core: Pass the rest of to props to Select (#22776)
* Pass the rest of to props to Select

* Remove log

(cherry picked from commit 451c95808d)
2020-03-19 12:27:02 +01:00
Carl Bergquist
e00f393a17 Add support for sending health check to datasource plugins. (#22771)
closes #21519
ref grafana/grafana-plugin-sdk-go#93

(cherry picked from commit ebc9549cbc)
2020-03-19 12:27:02 +01:00
Marcus Andersson
277e00aaed Datasource: making sure we are having the same data field order when using mixed data sources. (#22718)
* changed so data query response always it returned in the correct order when using mixed data sources.

* refactored the code to make it a bit simpler and not failing the tests.

* changed to simple array type.

(cherry picked from commit f44c0f0643)
2020-03-19 12:27:02 +01:00
Dominik Prokop
ba6104190e DashboardSave: Autofocus save dashboard form input (#22748)
(cherry picked from commit b441b73345)
2020-03-19 12:27:02 +01:00
Steven Vachon
eaaca91f25 @grafana/e2e: cherry picked 4fecf5a7a6 (#22739) 2020-03-12 08:51:05 +01:00
Arve Knudsen
a551cd2470 Release version 6.7.0-beta1 (#22727) 2020-03-11 16:30:55 +01:00
12407 changed files with 1533834 additions and 1150706 deletions

14
.babelrc Normal file
View File

@@ -0,0 +1,14 @@
{
"presets": [
[
"@babel/preset-env",
{
"targets": {
"browsers": "last 3 versions"
},
"useBuiltIns": "entry",
"modules": "false",
}
]
]
}

File diff suppressed because it is too large

View File

@@ -1,69 +0,0 @@
import { regexp } from '@betterer/regexp';
import { BettererFileTest } from '@betterer/betterer';
import { ESLint, Linter } from 'eslint';
import { existsSync } from 'fs';
export default {
'no enzyme tests': () => regexp(/from 'enzyme'/g).include('**/*.test.*'),
'better eslint': () => countEslintErrors().include('**/*.{ts,tsx}'),
'no undocumented stories': () => countUndocumentedStories().include('**/*.story.tsx'),
};
function countUndocumentedStories() {
return new BettererFileTest(async (filePaths, fileTestResult) => {
filePaths.forEach((filePath) => {
if (!existsSync(filePath.replace(/\.story.tsx$/, '.mdx'))) {
// In this case the file contents don't matter:
const file = fileTestResult.addFile(filePath, '');
// Add the issue to the first character of the file:
file.addIssue(0, 0, 'No undocumented stories are allowed, please add an .mdx file with some documentation');
}
});
});
}
function countEslintErrors() {
return new BettererFileTest(async (filePaths, fileTestResult, resolver) => {
const { baseDirectory } = resolver;
const cli = new ESLint({ cwd: baseDirectory });
await Promise.all(
filePaths.map(async (filePath) => {
const linterOptions = (await cli.calculateConfigForFile(filePath)) as Linter.Config;
const rules: Partial<Linter.RulesRecord> = {
'@typescript-eslint/no-explicit-any': 'error',
};
if (!filePath.endsWith('.test.tsx') && !filePath.endsWith('.test.ts')) {
rules['@typescript-eslint/consistent-type-assertions'] = [
'error',
{
assertionStyle: 'never',
},
];
}
const runner = new ESLint({
baseConfig: {
...linterOptions,
rules,
},
useEslintrc: false,
cwd: baseDirectory,
});
const lintResults = await runner.lintFiles([filePath]);
lintResults
.filter((lintResult) => lintResult.source)
.forEach((lintResult) => {
const { messages } = lintResult;
const file = fileTestResult.addFile(filePath, '');
messages.forEach((message, index) => {
file.addIssue(0, 0, message.message, `${index}`);
});
});
})
);
});
}

12
.bingo/.gitignore vendored
View File

@@ -1,12 +0,0 @@
# Ignore everything
*
# But not these files:
!.gitignore
!*.mod
!README.md
!Variables.mk
!variables.env
*tmp.mod

View File

@@ -1,14 +0,0 @@
# Project Development Dependencies.
This directory stores Go modules with pinned buildable packages that are used within this repository, managed by https://github.com/bwplotka/bingo.
- Run `bingo get` to install all tools that have their own module file in this directory.
- Run `bingo get <tool>` to install the <tool> that has its own module file in this directory.
- For Makefile: Make sure to put `include .bingo/Variables.mk` in your Makefile, then use the $(<upper case tool name>) variable where <tool> is the .bingo/<tool>.mod.
- For shell: Run `source .bingo/variables.env` to source all environment variables for each tool.
- For go: Import `.bingo/variables.go` for variable names.
- See https://github.com/bwplotka/bingo or -h on how to add, remove or change binary dependencies.
## Requirements
- Go 1.14+
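
As a concrete sketch of the workflow described above, using the drone tool that is pinned in this directory (the generated Variables.mk and variables.env shown below provide the `$(DRONE)` / `$DRONE` variables it references):

```sh
# Install every tool pinned under .bingo/ into $GOBIN (drone, wire, ...).
bingo get

# Shell usage: source the generated variable file, then call the pinned binary.
source .bingo/variables.env
"$DRONE" --version

# Makefile usage: add `include .bingo/Variables.mk` to the project Makefile and
# depend on $(DRONE); the pinned version is (re)installed automatically when needed.
```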

View File

@@ -1,31 +0,0 @@
# Auto generated binary variables helper managed by https://github.com/bwplotka/bingo v0.5.1. DO NOT EDIT.
# All tools are designed to be built inside $GOBIN.
BINGO_DIR := $(dir $(lastword $(MAKEFILE_LIST)))
GOPATH ?= $(shell go env GOPATH)
GOBIN ?= $(firstword $(subst :, ,${GOPATH}))/bin
GO ?= $(shell which go)
# Below generated variables ensure that every time a tool under each variable is invoked, the correct version
# will be used; reinstalling only if needed.
# For example for drone variable:
#
# In your main Makefile (for non array binaries):
#
#include .bingo/Variables.mk # Assuming -dir was set to .bingo .
#
#command: $(DRONE)
# @echo "Running drone"
# @$(DRONE) <flags/args..>
#
DRONE := $(GOBIN)/drone-v1.4.0
$(DRONE): $(BINGO_DIR)/drone.mod
@# Install binary/ries using Go 1.14+ build command. This is using bwplotka/bingo-controlled, separate go module with pinned dependencies.
@echo "(re)installing $(GOBIN)/drone-v1.4.0"
@cd $(BINGO_DIR) && $(GO) build -mod=mod -modfile=drone.mod -o=$(GOBIN)/drone-v1.4.0 "github.com/drone/drone-cli/drone"
WIRE := $(GOBIN)/wire-v0.5.0
$(WIRE): $(BINGO_DIR)/wire.mod
@# Install binary/ries using Go 1.14+ build command. This is using bwplotka/bingo-controlled, separate go module with pinned dependencies.
@echo "(re)installing $(GOBIN)/wire-v0.5.0"
@cd $(BINGO_DIR) && $(GO) build -mod=mod -modfile=wire.mod -o=$(GOBIN)/wire-v0.5.0 "github.com/google/wire/cmd/wire"

View File

@@ -1,7 +0,0 @@
module _ // Auto generated by https://github.com/bwplotka/bingo. DO NOT EDIT
go 1.17
replace github.com/docker/docker => github.com/docker/engine v17.12.0-ce-rc1.0.20200309214505-aa6a9891b09c+incompatible
require github.com/drone/drone-cli v1.4.0 // drone

View File

@@ -1 +0,0 @@
module _ // Fake go.mod auto-created by 'bingo' for go -moddir compatibility with non-Go projects. Commit this file, together with other .mod files.

View File

@@ -1,14 +0,0 @@
# Auto generated binary variables helper managed by https://github.com/bwplotka/bingo v0.5.1. DO NOT EDIT.
# All tools are designed to be built inside $GOBIN.
# Those variables will work only until 'bingo get' was invoked, or if tools were installed via Makefile's Variables.mk.
GOBIN=${GOBIN:=$(go env GOBIN)}
if [ -z "$GOBIN" ]; then
GOBIN="$(go env GOPATH)/bin"
fi
DRONE="${GOBIN}/drone-v1.4.0"
WIRE="${GOBIN}/wire-v0.5.0"

View File

@@ -1,5 +0,0 @@
module _ // Auto generated by https://github.com/bwplotka/bingo. DO NOT EDIT
go 1.16
require github.com/google/wire v0.5.0 // cmd/wire

View File

@@ -1,22 +1,19 @@
[run]
init_cmds = [
["make", "gen-go"],
["GO_BUILD_DEV=1", "make", "build-cli"],
["GO_BUILD_DEV=1", "make", "build-server"],
["./bin/grafana-server", "-packaging=dev", "cfg:app_mode=development"]
["go", "run", "-mod=vendor", "build.go", "-dev", "build-cli"],
["go", "run", "-mod=vendor", "build.go", "-dev", "build-server"],
["./bin/grafana-server", "-packaging=dev", "cfg:app_mode=development"]
]
watch_all = true
follow_symlinks = true
watch_dirs = [
"$WORKDIR/pkg",
"$WORKDIR/public/views",
"$WORKDIR/conf",
"$WORKDIR/pkg",
"$WORKDIR/public/views",
"$WORKDIR/conf",
]
watch_exts = [".go", ".ini", ".toml", ".template.html"]
ignore_files = ["wire_gen.go"]
build_delay = 1500
cmds = [
["make", "gen-go"],
["GO_BUILD_DEV=1", "make", "build-server"],
["./bin/grafana-server", "-packaging=dev", "cfg:app_mode=development"]
["go", "run", "-mod=vendor", "build.go", "-dev", "build-server"],
["./bin/grafana-server", "-packaging=dev", "cfg:app_mode=development"]
]

View File

@@ -1,15 +1,4 @@
[dev]
last 1 chrome versions
last 1 firefox versions
last 1 safari versions
[production]
last 2 Firefox versions
last 2 Chrome versions
last 2 Safari versions
last 2 Edge versions
last 1 ios_saf versions
last 1 and_chr versions
last 1 samsung versions
>1%,
Chrome > 20
last 4 versions,
Firefox ESR

1485
.circleci/config.yml Normal file

File diff suppressed because it is too large

View File

@@ -12,14 +12,7 @@ Dockerfile
docs
dump.rdb
node_modules
**/node_modules
/local
/tmp
*.yml
!.yarnrc.yml
*.md
.yarn/*
!.yarn/patches
!.yarn/releases
!.yarn/plugins
!.yarn/versions
!.yarn/cache

View File

@@ -1,21 +0,0 @@
# To generate the .drone.yml file:
# 1. Modify the *.star definitions
# 2. Login to drone and export the env variables (token and server) shown here: https://drone.grafana.net/account
# 3. Run `make drone`
# More information about this process here: https://github.com/grafana/deployment_tools/blob/master/docs/infrastructure/drone/signing.md
load('scripts/drone/pipelines/pr.star', 'pr_pipelines')
load('scripts/drone/pipelines/main.star', 'main_pipelines')
load('scripts/drone/pipelines/docs.star', 'docs_pipelines')
load('scripts/drone/pipelines/release.star', 'release_pipelines', 'publish_image_pipelines', 'publish_artifacts_pipelines', 'publish_npm_pipelines', 'publish_packages_pipeline')
load('scripts/drone/version.star', 'version_branch_pipelines')
load('scripts/drone/pipelines/cron.star', 'cronjobs')
load('scripts/drone/vault.star', 'secrets')
def main(ctx):
edition = 'oss'
return pr_pipelines(edition=edition) + main_pipelines(edition=edition) + release_pipelines() + \
publish_image_pipelines('public') + publish_image_pipelines('security') + \
publish_artifacts_pipelines('security') + publish_artifacts_pipelines('public') + \
publish_npm_pipelines('public') + publish_packages_pipeline() + \
version_branch_pipelines() + cronjobs(edition=edition) + secrets()
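
For reference, the regeneration steps from the header comment above might look like this in a shell session (a sketch; the token and server values come from https://drone.grafana.net/account, and `make drone` is the documented entry point):

```sh
# 1. Edit the *.star pipeline definitions loaded above (scripts/drone/...).
# 2. Export the Drone credentials shown on the account page (placeholder values here).
export DRONE_SERVER=https://drone.grafana.net
export DRONE_TOKEN=<token from https://drone.grafana.net/account>
# 3. Regenerate and sign .drone.yml.
make drone
```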

4686
.drone.yml

File diff suppressed because it is too large

View File

@@ -25,6 +25,3 @@ trim_trailing_whitespace = false
[Makefile]
indent_style = tab
indent_size = 2
[*.star]
indent_size = 4

View File

@@ -1,25 +0,0 @@
.git
.github
.yarn
build
compiled
data
deployment_tools_config.json
devenv
dist
e2e/tmp
node_modules
pkg
public/lib/monaco
scripts/grafana-server/tmp
vendor
# TS generate from cue by cuetsy
**/*.gen.ts
# Auto-generated localisation files
public/locales/_build/
public/locales/**/*.js
# Auto-generated icon file
packages/grafana-ui/src/components/Icon/iconBundle.ts

View File

@@ -1,40 +1,4 @@
{
"extends": ["@grafana/eslint-config"],
"root": true,
"plugins": ["@emotion", "lodash", "jest", "import"],
"settings": {
"import/internal-regex": "^(app/)|(@grafana)",
"import/external-module-folders": ["node_modules", ".yarn"]
},
"rules": {
"react/prop-types": "off",
"@emotion/jsx-import": "error",
"lodash/import-scope": [2, "member"],
"jest/no-focused-tests": "error",
"import/order": [
"error",
{
"groups": [["builtin", "external"], "internal", "parent", "sibling", "index"],
"newlines-between": "always",
"alphabetize": { "order": "asc" }
}
]
},
"overrides": [
{
"files": ["packages/grafana-ui/src/components/uPlot/**/*.{ts,tsx}"],
"rules": {
"react-hooks/rules-of-hooks": "off",
"react-hooks/exhaustive-deps": "off"
}
},
{
"files": ["packages/grafana-ui/src/components/ThemeDemos/**/*.{ts,tsx}"],
"rules": {
"@emotion/jsx-import": "off",
"react/jsx-uses-react": "off",
"react/react-in-jsx-scope": "off"
}
}
]
"root": true
}

1
.gitattributes vendored
View File

@@ -1 +0,0 @@
* text=auto eol=lf

170
.github/CODEOWNERS vendored
View File

@@ -11,173 +11,11 @@
# In each subsection folders are ordered first by depth, then alphabetically.
# This should make it easy to add new rules without breaking existing ones.
# Documentation owner: Jita Chatterjee
/docs/ @grafana/docs-squad @pkolyvas
/contribute/ @marcusolsson @grafana/docs-squad @pkolyvas
/docs/sources/developers/plugins/ @marcusolsson @grafana/docs-squad @grafana/plugins-platform-frontend @grafana/plugins-platform-backend
/docs/sources/developers/plugins/backend @marcusolsson @grafana/docs-squad @grafana/plugins-platform-backend
/docs/sources/enterprise/ @osg-grafana @grafana/docs-squad
# Documentation owner: Diana Payton
/docs/ @oddlittlebird
/contribute/ @oddlittlebird @marcusolsson
# Backend code
*.go @grafana/backend-platform
go.mod @grafana/backend-platform
go.sum @grafana/backend-platform
/.bingo @grafana/backend-platform
# Continuous Integration
.drone.yml @grafana/grafana-release-eng
.drone.star @grafana/grafana-release-eng
/scripts/drone/ @grafana/grafana-release-eng
/pkg/build/ @grafana/grafana-release-eng
# Cloud Datasources backend code
/pkg/tsdb/cloudwatch @grafana/cloud-datasources
/pkg/tsdb/azuremonitor @grafana/cloud-datasources
/pkg/tsdb/cloudmonitoring @grafana/cloud-datasources
# Observability backend code
/pkg/tsdb/prometheus @grafana/observability-metrics
/pkg/tsdb/influxdb @grafana/observability-metrics
/pkg/tsdb/elasticsearch @grafana/observability-logs-and-traces
/pkg/tsdb/graphite @grafana/observability-metrics
/pkg/tsdb/jaeger @grafana/observability-logs-and-traces
/pkg/tsdb/loki @grafana/observability-logs-and-traces
/pkg/tsdb/zipkin @grafana/observability-logs-and-traces
/pkg/tsdb/tempo @grafana/observability-logs-and-traces
# BI backend code
/pkg/tsdb/mysql @grafana/grafana-bi-squad
/pkg/tsdb/postgres @grafana/grafana-bi-squad
/pkg/tsdb/mssql @grafana/grafana-bi-squad
# Database migrations
/pkg/services/sqlstore/migrations @grafana/backend-platform @grafana/hosted-grafana-team
*_mig.go @grafana/backend-platform @grafana/hosted-grafana-team
# Grafana edge
/pkg/services/live/ @grafana/grafana-edge-squad
/pkg/services/searchV2/ @grafana/grafana-edge-squad
/pkg/services/store/ @grafana/grafana-edge-squad
/pkg/services/export/ @grafana/grafana-edge-squad
/pkg/infra/filestore/ @grafana/grafana-edge-squad
pkg/tsdb/testdatasource/sims/ @grafana/grafana-edge-squad
# Alerting
/pkg/services/ngalert @grafana/alerting-squad-backend
/pkg/services/sqlstore/migrations/ualert @grafana/alerting-squad-backend
/pkg/services/alerting @grafana/alerting-squad-backend
/pkg/tests/api/alerting @grafana/alerting-squad-backend
/public/app/features/alerting @grafana/alerting-squad-frontend
# Library Services
/pkg/services/libraryelements @grafana/user-essentials
/pkg/services/librarypanels @grafana/user-essentials
# Plugins
/pkg/api/pluginproxy @grafana/plugins-platform-backend
/pkg/plugins @grafana/plugins-platform-backend
/pkg/services/datasourceproxy @grafana/plugins-platform-backend
/pkg/services/datasources @grafana/plugins-platform-backend
# Dashboard previews / crawler (behind feature flag)
/pkg/services/thumbs @grafana/grafana-edge-squad
# Backend code docs
/contribute/style-guides/backend.md @grafana/backend-platform
/contribute/architecture/backend @grafana/backend-platform
/contribute/engineering/backend @grafana/backend-platform
/e2e @grafana/user-essentials
/packages @grafana/user-essentials @grafana/plugins-platform-frontend @grafana/grafana-bi-squad
/packages/grafana-e2e-selectors @grafana/user-essentials
/packages/grafana-e2e @grafana/user-essentials
/packages/grafana-toolkit @grafana/plugins-platform-frontend
/packages/grafana-ui/.storybook @grafana/plugins-platform-frontend
/packages/grafana-ui/src/components/DateTimePickers @grafana/grafana-bi-squad
/packages/grafana-ui/src/components/GraphNG @grafana/grafana-bi-squad
/packages/grafana-ui/src/components/Table @grafana/grafana-bi-squad
/packages/grafana-ui/src/components/TimeSeries @grafana/grafana-bi-squad
/packages/grafana-ui/src/components/uPlot @grafana/grafana-bi-squad
/packages/grafana-ui/src/utils/storybook @grafana/plugins-platform-frontend
/packages/jaeger-ui-components/ @grafana/observability-logs-and-traces
/plugins-bundled @grafana/plugins-platform-frontend
# public folder
/public/app/core/components/TimePicker @grafana/grafana-bi-squad
/public/app/core/components/Layers @grafana/grafana-edge-squad
/public/app/features/canvas/ @grafana/grafana-edge-squad
/public/app/features/comments/ @grafana/grafana-edge-squad
/public/app/features/dimensions/ @grafana/grafana-edge-squad
/public/app/features/geo/ @grafana/grafana-edge-squad
/public/app/features/live/ @grafana/grafana-edge-squad
/public/app/features/explore/ @grafana/observability-experience-squad
/public/app/features/plugins @grafana/plugins-platform-frontend
/public/app/features/transformers/spatial @grafana/grafana-edge-squad
/public/app/plugins/panel/alertlist @grafana/alerting-squad
/public/app/plugins/panel/barchart @grafana/grafana-bi-squad
/public/app/plugins/panel/heatmap @grafana/grafana-bi-squad
/public/app/plugins/panel/histogram @grafana/grafana-bi-squad
/public/app/plugins/panel/logs @grafana/observability-logs-and-traces
/public/app/plugins/panel/nodeGraph @grafana/observability-logs-and-traces
/public/app/plugins/panel/piechart @grafana/grafana-bi-squad
/public/app/plugins/panel/state-timeline @grafana/grafana-bi-squad
/public/app/plugins/panel/status-history @grafana/grafana-bi-squad
/public/app/plugins/panel/table @grafana/grafana-bi-squad
/public/app/plugins/panel/timeseries @grafana/grafana-bi-squad
/public/app/plugins/panel/geomap @grafana/grafana-edge-squad
/public/app/plugins/panel/canvas @grafana/grafana-edge-squad
/public/app/plugins/panel/candlestick @grafana/grafana-edge-squad
/public/app/plugins/panel/icon @grafana/grafana-edge-squad
/scripts/build/release-packages.sh @grafana/plugins-platform-frontend
/scripts/circle-release-next-packages.sh @grafana/plugins-platform-frontend
/scripts/ci-frontend-metrics.sh @grafana/user-essentials @grafana/plugins-platform-frontend @grafana/grafana-bi-squad
/scripts/ci-reference-docs-build.sh @grafana/plugins-platform-frontend
/scripts/ci-reference-docs-lint.sh @grafana/plugins-platform-frontend
/scripts/grunt @grafana/frontend-ops
/scripts/webpack @grafana/frontend-ops
/scripts/generate-a11y-report.sh @grafana/user-essentials
package.json @grafana/frontend-ops
tsconfig.json @grafana/frontend-ops
lerna.json @grafana/frontend-ops
.babelrc @grafana/frontend-ops
.prettierrc.js @grafana/frontend-ops
.eslintrc @grafana/frontend-ops
.pa11yci.conf.js @grafana/user-essentials
.pa11yci-pr.conf.js @grafana/user-essentials
# @grafana/ui component documentation
*.mdx @marcusolsson @jessover9000 @grafana/plugins-platform-frontend
# Core datasources
/public/app/plugins/datasource/cloudwatch @grafana/cloud-datasources
/public/app/plugins/datasource/elasticsearch @grafana/observability-logs-and-traces
/public/app/plugins/datasource/grafana-azure-monitor-datasource @grafana/cloud-datasources
/public/app/plugins/datasource/graphite @grafana/observability-metrics
/public/app/plugins/datasource/influxdb @grafana/observability-metrics
/public/app/plugins/datasource/jaeger @grafana/observability-logs-and-traces
/public/app/plugins/datasource/loki @grafana/observability-logs-and-traces
/public/app/plugins/datasource/mssql @grafana/grafana-bi-squad
/public/app/plugins/datasource/mysql @grafana/grafana-bi-squad
/public/app/plugins/datasource/opentsdb @grafana/backend-platform
/public/app/plugins/datasource/postgres @grafana/grafana-bi-squad
/public/app/plugins/datasource/prometheus @grafana/observability-metrics
/public/app/plugins/datasource/cloud-monitoring @grafana/cloud-datasources
/public/app/plugins/datasource/zipkin @grafana/observability-logs-and-traces
/public/app/plugins/datasource/tempo @grafana/observability-logs-and-traces
/public/app/plugins/datasource/alertmanager @grafana/alerting-squad
# Cloud middleware
/grafana-mixin/ @grafana/hosted-grafana-team
# Grafana authentication and authorization
/pkg/services/accesscontrol @grafana/grafana-authnz-team
/pkg/services/auth @grafana/grafana-authnz-team
/pkg/services/dashboards/accesscontrol.go @grafana/grafana-authnz-team
/pkg/services/datasources/permissions @grafana/grafana-authnz-team
/pkg/services/datasources/permissions/accesscontrol.go @grafana/grafana-authnz-team
/pkg/services/guardian @grafana/grafana-authnz-team
/pkg/services/ldap @grafana/grafana-authnz-team
/pkg/services/login @grafana/grafana-authnz-team
/pkg/services/multildap @grafana/grafana-authnz-team
/pkg/services/oauthtoken @grafana/grafana-authnz-team
/pkg/services/teamguardian @grafana/grafana-authnz-team
/pkg/services/serviceaccounts @grafana/grafana-authnz-team
go.sum @grafana/backend-platform

View File

@@ -5,13 +5,9 @@ labels: 'type: bug'
---
<!--
Please use this template to create your bug report. By providing as much info as possible you help us understand the issue, reproduce it and resolve it for you quicker. Therefore take a couple of extra minutes to make sure you have provided all info needed.
PROTIP: record your screen and attach it as a gif to showcase the issue.
- Questions should be posted to: https://community.grafana.com
- Use query inspector to troubleshoot issues: https://bit.ly/2XNF6YS
- How to record and attach gif: https://bit.ly/2Mi8T6K
Please use this template while reporting a bug and provide as much info as possible.
Questions should be posted to https://community.grafana.com
Use query inspector to troubleshoot issues: https://community.grafana.com/t/using-grafanas-query-inspector-to-troubleshoot-issues/2630
-->
**What happened**:

View File

@@ -0,0 +1,11 @@
---
name: Enhancement request
about: Suggest an enhancement or new feature for the Grafana project
labels: 'type: feature request'
---
<!-- Please only use this template for submitting feature requests -->
**What would you like to be added**:
**Why is this needed**:

View File

@@ -1,39 +0,0 @@
---
name: '@grafana/ui component request'
about: Suggest a component for the @grafana/ui package
labels: 'area/grafana/ui'
---
<!--
By using this template you will make it easier for us to make sure that documentation and implementation stays up to date for every component in @grafana/ui
Thank you!
-->
**Why is this component needed**:
<!-- Explain your use case -->
___
- [ ] Is/could it be used in more than one place in Grafana?
**Where is/could it be used?**:
___
- [ ] Post screenshots if possible.
- [ ] It has a single use case.
- [ ] It is/could be used in multiple places.
**Implementation** (Checklist meant for the person implementing the component)
- [ ] Component has a story in Storybook.
- [ ] Props and naming follows [our style guide](https://github.com/grafana/grafana/blob/main/contribute/style-guides/frontend.md).
- [ ] It is extendable (rest props are spread, styles with className work, and so on).
- [ ] Uses [theme for spacing, colors, and so on](https://github.com/grafana/grafana/blob/main/contribute/style-guides/themes.md).
- [ ] Works with both light and dark theme.
**Documentation**
- [ ] Properties are documented.
- [ ] Use cases are described.
- [ ] Code examples for the different use cases.
- [ ] Dos and don'ts.
- [ ] Styling guidelines, specific color usage (if applicable).

View File

@@ -1,43 +0,0 @@
name: UX design issue
description: Create an issue for delivering wireframes, mockups or other design solutions.
title: "UX: "
labels: ["type/ux"]
body:
- type: textarea
id: background
attributes:
label: "Background / Why we're doing this"
description: Describe the problem and background of the issue. This could include research insights that inform the design changes, unmet user needs, or other usability issues.
placeholder: Add UI improvements to make Grafana Alerting alert creation easier based on usability test results.
validations:
required: true
- type: dropdown
attributes:
label: Is there existing research for this?
description: Please link research results or insights in the Background section if you have any. If no research was conducted, you might want to consider usability testing your design later.
options: [
"Yes, I have linked it",
"No research yet"
]
validations:
required: true
- type: textarea
id: problems-or-tasks
attributes:
label: Problems or tasks
description: Describe problems the new design should solve or tasks the user needs to complete.
placeholder:
value: |
- A problem we're trying to solve
- A task the user needs to accomplish
- …
validations:
required: false
- type: textarea
attributes:
label: Deliverables
description: Add a checklist of deliverables here. You can later add links to each deliverable.
value: |
- Figma mockup
- Miro board
- …

View File

@@ -1,8 +1,5 @@
blank_issues_enabled: false
contact_links:
- name: Feature Request
url: https://github.com/grafana/grafana/discussions/new
about: Discuss ideas for new features or changes
- name: Questions & Help
url: https://community.grafana.com
about: Please ask and answer questions here.

View File

@@ -2,7 +2,7 @@
Thank you for sending a pull request! Here are some tips:
1. If this is your first time, please read our contribution guide at https://github.com/grafana/grafana/blob/main/CONTRIBUTING.md
1. If this is your first time, please read our contribution guide at https://github.com/grafana/grafana/blob/master/CONTRIBUTING.md
2. Ensure you include and run the appropriate tests as part of your Pull Request.
@@ -10,7 +10,7 @@ Thank you for sending a pull request! Here are some tips:
4. If the Pull Request is a work in progress, make use of GitHub's "Draft PR" feature and mark it as such.
5. If you can not merge your Pull Request due to a merge conflict, Rebase it. This gets it in sync with the main branch.
5. If you can not merge your Pull Request due to a merge conflict, Rebase it. This gets it in sync with the master branch.
6. Name your PR as "<FeatureArea>: Describe your change", e.g. Alerting: Prevent race condition. If it's a fix or feature relevant for the changelog describe the user impact in the title. The PR title is used to auto-generate the changelog for issues marked with the "add to changelog" label.
@@ -22,7 +22,7 @@ Thank you for sending a pull request! Here are some tips:
<!--
- Automatically closes linked issue when the Pull Request is merged.
* Automatically closes linked issue when the Pull Request is merged.
Usage: "Fixes #<issue number>", or "Fixes (paste link of issue)"

View File

@@ -0,0 +1,7 @@
FROM alpine
RUN apk update
RUN apk add rsync git bash
COPY entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/bin/bash", "/entrypoint.sh"]

View File

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2019 Sean Middleditch
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@@ -0,0 +1,60 @@
publish-to-git
==============
[GitHub Action](https://github.com/features/actions) for publishing a directory
and its contents to another git repository.
This can be especially useful for publishing static website, such as with
[GitHub Pages](https://pages.github.com/), from built files in other job
steps, such as [Doxygen](http://www.doxygen.nl/) generated HTML files.
**NOTE**: GitHub currently requires the use of a Personal Access Token for
pushing to other repositories. Pushing to the current repository should work
with the always-available GitHub Token (available via
`{{ secrets.GITHUB_TOKEN }}`). If pushing to another repository, a Personal
Access Token will need to be [created](https://help.github.com/en/articles/creating-a-personal-access-token-for-the-command-line) and assigned to the
workflow [secrets](https://help.github.com/en/articles/virtual-environments-for-github-actions#creating-and-using-secrets-encrypted-variables).
Inputs
------
- `repository`: Destination repository (default: current repository).
- `branch`: Destination branch (required).
- `host`: Destination git host (default: `github.com`).
- `github_token`: GitHub Token (required; use `secrets.GITHUB_TOKEN`).
- `github_pat`: Personal Access Token or other https credentials.
- `source_folder`: Source folder in workspace to copy (default: workspace root).
- `target_folder`: Target folder in destination branch to copy to (default: repository root).
- `commit_author`: Override commit author (default: `{github.actor}@users.noreply.github.com`).
- `commit_message`: Set commit message (default: `[workflow] Publish from [repository]:[branch]/[folder]`).
- `dry_run`: Does not push if non-empty (default: empty).
- `working_directory`: Location to checkout repository (default: random location in `${HOME}`)
Outputs
-------
- `commit_hash`: SHA hash of the new commit.
- `working_directory`: Working directory of git clone of repository.
License
-------
MIT License. See [LICENSE](LICENSE) for details.
Usage Example
-------------
```yaml
jobs:
publish:
- uses: actions/checkout@master
- run: |
sh scripts/build-doxygen-html.sh --out static/html
- uses: seanmiddleditch/gha-publish-to-git@master
with:
branch: gh-pages
github_token: '${{ secrets.GITHUB_TOKEN }}'
github_pat: '${{ secrets.GH_PAT }}'
source_folder: static/html
if: success() && github.event == 'push'
```

View File

@@ -0,0 +1,60 @@
---
name: publish-to-git
description: 'Publish files to a git repository'
branding:
icon: 'git-commit'
color: 'blue'
inputs:
repository:
description: 'Destination repository (default: current repository)'
default: ''
branch:
description: 'Destination branch'
required: true
host:
description: 'Destination git host'
default: 'github.com'
github_token:
description: 'GitHub Token (use `secrets.GITHUB_TOKEN`)'
required: true
github_pat:
description: 'Personal Access Token or other https credentials'
default: ''
source_folder:
description: 'Source folder in workspace to copy (default: workspace root)'
default: ''
target_folder:
description: 'Target folder in destination branch to copy to (default: repository root)'
default: ''
commit_author:
description: 'User Name <email@address> (default: [github.actor]@users.noreply.github.com)'
default: ''
commit_message:
description: 'Commit message (default: [workflow] Publish from [repository]:[branch]/[folder])'
default: ''
dry_run:
description: 'Do not push to repository (set to non-empty string to make dry-run)'
default: ''
working_directory:
description: 'Working directory for clone (default: random location in `${HOME}`)'
default: ''
outputs:
commit_hash:
description: 'Hash of the new commit'
working_directory:
description: 'Working directory of temporary repository'
runs:
using: 'docker'
image: 'Dockerfile'
args:
- ${{ inputs.repository }}
- ${{ inputs.branch }}
- ${{ inputs.host }}
- ${{ inputs.github_token }}
- ${{ inputs.github_pat }}
- ${{ inputs.source_folder }}
- ${{ inputs.target_folder }}
- ${{ inputs.commit_author }}
- ${{ inputs.commit_message }}
- ${{ inputs.dry_run }}
- ${{ inputs.working_directory }}

View File

@@ -0,0 +1,99 @@
#!/bin/bash
# Name the Docker inputs.
#
INPUT_REPOSITORY="$1"
INPUT_BRANCH="$2"
INPUT_HOST="$3"
INPUT_GITHUB_TOKEN="$4"
INPUT_GITHUB_PAT="$5"
INPUT_SOURCE_FOLDER="$6"
INPUT_TARGET_FOLDER="$7"
INPUT_COMMIT_AUTHOR="$8"
INPUT_COMMIT_MESSAGE="$9"
INPUT_DRYRUN="${10}"
INPUT_WORKDIR="${11}"
# Check for required inputs.
#
[ -z "$INPUT_BRANCH" ] && echo >&2 "::error::'branch' is required" && exit 1
[ -z "$INPUT_GITHUB_TOKEN" -a -z "$INPUT_GITHUB_PAT" ] && echo >&2 "::error::'github_token' or 'github_pat' is required" && exit 1
# Set state from inputs or defaults.
#
REPOSITORY="${INPUT_REPOSITORY:-${GITHUB_REPOSITORY}}"
BRANCH="${INPUT_BRANCH}"
HOST="${INPUT_GIT_HOST:-github.com}"
TOKEN="${INPUT_GITHUB_PAT:-${INPUT_GITHUB_TOKEN}}"
REMOTE="${INPUT_REMOTE:-https://${TOKEN}@${HOST}/${REPOSITORY}.git}"
SOURCE_FOLDER="${INPUT_SOURCE_FOLDER:-.}"
TARGET_FOLDER="${INPUT_TARGET_FOLDER}"
REF="${GITHUB_BASE_REF:-${GITHUB_REF}}"
REF_BRANCH=$(echo "${REF}" | rev | cut -d/ -f1 | rev)
[ -z "$REF_BRANCH" ] && echo 2>&1 "No ref branch" && exit 1
COMMIT_AUTHOR="${INPUT_AUTHOR:-${GITHUB_ACTOR} <${GITHUB_ACTOR}@users.noreply.github.com>}"
COMMIT_MESSAGE="${INPUT_COMMIT_MESSAGE:-[${GITHUB_WORKFLOW}] Publish from ${GITHUB_REPOSITORY}:${REF_BRANCH}/${SOURCE_FOLDER}}"
# Calculate the real source path.
#
SOURCE_PATH="$(realpath "${SOURCE_FOLDER}")"
[ -z "${SOURCE_PATH}" ] && exit 1
echo "::debug::SOURCE_PATH=${SOURCE_PATH}"
# Let's start doing stuff.
echo "Publishing ${SOURCE_FOLDER} to ${REMOTE}:${BRANCH}/${TARGET_FOLDER}"
# Create a working directory; the workspace may be filled with other important
# files.
#
WORK_DIR="${INPUT_WORKDIR:-$(mktemp -d "${HOME}/gitrepo.XXXXXX")}"
[ -z "${WORK_DIR}" ] && echo >&2 "::error::Failed to create temporary working directory" && exit 1
cd "${WORK_DIR}"
# Initialize git repo and configure for remote access.
#
echo "Initializing repository with remote ${REMOTE}"
git init || exit 1
git config --local user.email "${GITHUB_ACTOR}@users.noreply.github.com" || exit 1
git config --local user.name "${GITHUB_ACTOR}" || exit 1
git remote add origin "${REMOTE}" || exit 1
git remote -v
# Fetch initial (current contents).
#
echo "Fetching ${REMOTE}:${BRANCH}"
git fetch --depth 1 origin "${BRANCH}" || exit 1
git checkout -b "${BRANCH}" || exit 1
git pull origin "${BRANCH}" || exit 1
# Create the target directory (if necessary) and copy files from source.
#
TARGET_PATH="${WORK_DIR}/${TARGET_FOLDER}"
echo "Populating ${TARGET_PATH}"
mkdir -p "${TARGET_PATH}" || exit 1
rsync -a --quiet --delete "${SOURCE_PATH}/" "${TARGET_PATH}" || exit 1
# Create commit with changes.
#
echo "Creating commit"
git add "${TARGET_PATH}" || exit 1
git commit -m "${COMMIT_MESSAGE}" --author "${COMMIT_AUTHOR}" || exit 1
COMMIT_HASH="$(git rev-parse HEAD)"
echo "Created commit ${COMMIT_HASH}"
# Publish output variables.
#
echo "::set-output name=commit_hash::${COMMIT_HASH}"
echo "::set-output name=working_directory::${WORK_DIR}"
# Push if not a dry-run.
#
if [ -z "${INPUT_DRYRUN}" ] ; then
echo "Pushing to ${REMOTE}:${BRANCH}"
git push origin "${BRANCH}" || exit 1
else
echo "[DRY-RUN] Not pushing to ${REMOTE}:${BRANCH}"
fi

30
.github/bot.md vendored
View File

@@ -1,30 +0,0 @@
# GitHub & grafanabot automation
The bot is configured via [commands.json](https://github.com/grafana/grafana/blob/main/.github/commands.json) and some other GitHub workflows [workflows](https://github.com/grafana/grafana/tree/main/.github/workflows).
Comment commands:
* Write the word `/duplicate #<number>` anywhere in a comment and the bot will add the correct label and standard message.
* Write the word `/needsMoreInfo` anywhere in a comment and the bot will add the correct label and standard message.
Label commands:
* Add label `bot/question` and the bot will close with the standard question message and add the label `type/question`
* Add label `bot/duplicate` and the bot will close with the standard duplicate message and add the label `type/duplicate`
* Add label `bot/needs more info` for the bot to request more info (or use the comment command mentioned above)
* Add label `bot/close feature request` for the bot to close a feature request with a standard message and add the label `not implemented`
* Add label `bot/no new info` for the bot to close an issue where we asked for more info but have not received any updates in at least 14 days.
## Metrics
Metrics are configured in [metrics-collector.json](https://github.com/grafana/grafana/blob/main/.github/metrics-collector.json) and are also defined in the
[metrics-collector](https://github.com/grafana/grafana-github-actions/blob/main/metrics-collector/index.ts) GitHub action.
## Backport PR
To automatically backport a PR to a release branch like v7.3.x, add a label named `backport v7.3.x`. The label name should follow the pattern `backport <branch-name>`. Once merged, grafanabot will automatically
try to cherry-pick the PR merge commit into that branch and open a PR. It will sync the milestone with the source PR, so make sure the source PR is also assigned the milestone for the patch release. If the PR is already merged, you can still add this label and trigger the backport automation.
If there are merge conflicts, the bot will write a comment on the source PR saying the cherry-pick failed. In this case you have to do the cherry-pick and backport PR manually.
The backport logic is written [here](https://github.com/grafana/grafana-github-actions/blob/main/backport/backport.ts)

221
.github/commands.json vendored
View File

@@ -1,221 +0,0 @@
[
{
"type":"label",
"name":"bot/question",
"addLabel":"type/question",
"removeLabel":"bot/question",
"action":"close",
"comment":"Please ask your question on [community.grafana.com/](https://community.grafana.com/). To avoid having your issue closed in the future, please read our [CONTRIBUTING](https://github.com/grafana/grafana/blob/main/CONTRIBUTING.md) guidelines.\n\nHappy graphing!"
},
{
"type":"comment",
"name":"duplicate",
"allowUsers":[],
"action":"updateLabels",
"addLabel":"type/duplicate"
},
{
"type":"label",
"name":"bot/duplicate",
"addLabel":"type/duplicate",
"removeLabel":"bot/duplicate",
"action":"close",
"comment":"Thanks for creating this issue! It looks like this has already been reported by another user. Weve closed this in favor of the existing one. Please consider adding any details you think is missing to that issue.\n\nTo avoid having your issue closed in the future, please read our [CONTRIBUTING](https://github.com/grafana/grafana/blob/main/CONTRIBUTING.md) guidelines.\n\nHappy graphing!"
},
{
"type":"comment",
"name":"needsMoreInfo",
"allowUsers":[],
"action":"updateLabels",
"addLabel":"bot/needs more info"
},
{
"type":"label",
"name":"bot/needs more info",
"action":"updateLabels",
"addLabel":"needs more info",
"removeLabel":"bot/needs more info",
"comment":"Thanks for creating this issue! We think it's missing some basic information. \r\n\r\nFollow the issue template and add additional information that will help us replicate the problem. \r\nFor data visualization issues: \r\n- Query results from the inspect drawer (data tab & query inspector)\r\n- Panel settings can be extracted in the panel inspect drawer JSON tab\r\n\r\nFor dashboard related issues: \r\n- Dashboard JSON can be found in the dashboard settings JSON model view\r\n\r\nFor authentication, provisioning and alerting issues, Grafana server logs are useful. \r\n\r\nHappy graphing!"
},
{
"type":"label",
"name":"bot/no new info",
"action":"close",
"comment":"We've closed this issue since it needs more information and hasn't had any activity recently. We can re-open it after you you add more information. To avoid having your issue closed in the future, please read our [CONTRIBUTING](https://github.com/grafana/grafana/blob/main/CONTRIBUTING.md) guidelines.\n\nHappy graphing!"
},
{
"type":"label",
"name":"bot/close feature request",
"action":"close",
"addLabel":"not implemented",
"comment":"This feature request has been open for a long time with few received upvotes or comments, so we are closing it. We're trying to limit open GitHub issues in order to better track planned work and features. \r\n\r\nThis doesn't mean that we'll never ever implement it or that we will never accept a PR for it. A closed issue can still attract upvotes and act as a ticket to track feature demand\/interest. \r\n\r\nThank You to you for taking the time to create this issue!"
},
{
"type":"label",
"name":"oss-user-essentials",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/78"
}
},
{
"type":"label",
"name":"area/plugins-catalog",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/76"
}
},
{
"type":"label",
"name":"type/docs",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/69"
}
},
{
"type":"label",
"name":"datasource/Azure",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/97"
}
},
{
"type":"label",
"name":"datasource/CloudWatch",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/97"
}
},
{
"type":"label",
"name":"datasource/CloudWatch Logs",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/97"
}
},
{
"type":"label",
"name":"datasource/GoogleCloudMonitoring",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/97"
}
},
{
"type":"label",
"name":"datasource/Prometheus",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/112"
}
},
{
"type":"label",
"name":"datasource/InfluxDB",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/112"
}
},
{
"type":"label",
"name":"datasource/Graphite",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/112"
}
},
{
"type":"label",
"name":"datasource/OpenTSDB",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/112"
}
},
{
"type":"label",
"name":"datasource/OpenSearch",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/110"
}
},
{
"type":"label",
"name":"datasource/Loki",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/110"
}
},
{
"type":"label",
"name":"datasource/Tempo",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/110"
}
},
{
"type":"label",
"name":"datasource/Elasticsearch",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/110"
}
},
{
"type":"label",
"name":"datasource/Jaeger",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/110"
}
},
{
"type":"label",
"name":"datasource/Zipkin",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/110"
}
},
{
"type":"label",
"name":"area/explore",
"action":"addToProject",
"addToProject":{
"url":"https://github.com/orgs/grafana/projects/111"
}
},
{
"type":"label",
"name":"oss-user-essentials",
"action":"removeFromProject",
"removeFromProject":{
"url":"https://github.com/orgs/grafana/projects/78"
}
},
{
"type":"label",
"name":"oss-user-essentials",
"action":"removeFromProject",
"removeFromProject":{
"url":"https://github.com/grafana/grafana/projects/33"
}
},
{
"type": "label",
"name": "team/grafana-partners",
"action": "addToProject",
"addToProject": {
"url": "https://github.com/orgs/grafana/projects/87"
}
}
]

View File

@@ -1,10 +0,0 @@
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "daily"
- package-ecosystem: "gomod"
directory: "/"
schedule:
interval: "weekly"

View File

@@ -1,32 +0,0 @@
{
"queries": [
{
"name": "type_bug",
"query": "label:\"type/bug\" is:open"
},
{
"name": "type_docs",
"query": "label:\"type/docs\" is:open"
},
{
"name": "needs_investigation",
"query": "label:\"needs investigation\" is:open"
},
{
"name": "needs_more_info",
"query": "label:\"needs more info\" is:open"
},
{
"name": "unlabeled",
"query": "is:open is:issue no:label"
},
{
"name": "open_prs",
"query": "is:open is:pr"
},
{
"name": "milestone_7_4_open",
"query": "is:open is:issue milestone:7.4"
}
]
}

View File

@@ -1,49 +0,0 @@
[
{
"type": "check-milestone",
"title": "Milestone Check",
"targetUrl": "https://github.com/grafana/grafana/blob/main/contribute/merge-pull-request.md#assign-a-milestone",
"success": "Milestone set",
"failure": "Milestone not set"
},
{
"type": "check-label",
"title": "Backport Check",
"labels": {
"exists": "Backport enabled",
"notExists": "Backport decision needed",
"matches": [
"backport v*"
]
},
"skip": {
"message": "Backport skipped",
"matches": [
"backport",
"no-backport"
]
},
"targetUrl": "https://github.com/grafana/grafana/blob/main/contribute/merge-pull-request.md#should-the-pull-request-be-backported"
},
{
"type": "check-changelog",
"title": "Changelog Check",
"labels": {
"exists": "Changelog enabled",
"notExists": "Changelog decision needed",
"matches": [
"add to changelog"
]
},
"breakingChangeLabels": [
"breaking change"
],
"skip": {
"message": "Changelog skipped",
"matches": [
"no-changelog"
]
},
"targetUrl": "https://github.com/grafana/grafana/blob/main/contribute/merge-pull-request.md#include-in-changelog-and-release-notes"
}
]

View File

@@ -1,196 +0,0 @@
[
{
"type": "changedfiles",
"matches": [
"docs/**/*",
"contribute/**/*"
],
"action": "updateLabel",
"addLabel": "type/docs"
},
{
"type": "changedfiles",
"matches": [
"public/**/*",
"packages/**/*",
"e2e/**/*",
"plugins-bundled/**/*",
"scripts/build/release-packages.sh",
"scripts/circle-release-next-packages.sh",
"scripts/ci-frontend-metrics.sh",
"scripts/grunt/**/*",
"scripts/webpack/**/*",
"package.json",
"tsconfig.json",
"lerna.json",
".babelrc",
".prettierrc.js",
".eslintrc",
"**/*.mdx"
],
"action": "updateLabel",
"addLabel": "area/frontend"
},
{
"type": "changedfiles",
"matches": [
"**/*.go",
"go.mod",
"go.sum",
"contribute/style-guides/backend.md",
"contribute/architecture/backend/**/*",
"scripts/go/**/*"
],
"action": "updateLabel",
"addLabel": "area/backend"
},
{
"type": "changedfiles",
"matches": [
"pkg/services/sqlstore/migrations/**/*",
"**/*_mig.go"
],
"action": "updateLabel",
"addLabel": "area/backend/db/migration"
},
{
"type": "changedfiles",
"matches": [ "public/app/features/explore/**/*"],
"action": "updateLabel",
"addLabel": "area/explore"
},
{
"type": "changedfiles",
"matches": [
".circleci/**/*",
"packaging/**/*",
"scripts/build/**/*",
"scripts/*.sh",
"Makefile",
"Dockerfile",
"Dockerfile.ubuntu"
],
"action": "updateLabel",
"addLabel": "type/build-packaging"
},
{
"type": "changedfiles",
"matches": [
"scripts/*.star",
".drone.star",
".drone.yml"
],
"action": "updateLabel",
"addLabel": "type/ci"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/grafana-azure-monitor-datasource/**/*", "pkg/tsdb/azuremonitor/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Azure"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/cloud-monitoring/**/*", "pkg/tsdb/cloudmonitoring/**/*"],
"action": "updateLabel",
"addLabel": "datasource/GoogleCloudMonitoring"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/cloudwatch/**/*", "pkg/tsdb/cloudwatch/**/*"],
"action": "updateLabel",
"addLabel": "datasource/CloudWatch"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/elasticsearch/**/*", "pkg/tsdb/elasticsearch/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Elasticsearch"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/graphite/**/*", "pkg/tsdb/graphite/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Graphite"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/influxdb/**/*", "pkg/tsdb/influx/**/*"],
"action": "updateLabel",
"addLabel": "datasource/InfluxDB"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/jaeger"],
"action": "updateLabel",
"addLabel": "datasource/Jaeger"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/loki/**/*", "pkg/tsdb/loki/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Loki"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/mssql/**/*", "pkg/tsdb/mssql/**/*"],
"action": "updateLabel",
"addLabel": "datasource/MSSQL"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/mysql/**/*", "pkg/tsdb/mysql/**/*"],
"action": "updateLabel",
"addLabel": "datasource/MySQL"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/opentsdb/**/*", "pkg/tsdb/opentsdb/**/*"],
"action": "updateLabel",
"addLabel": "datasource/OpenTSDB"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/postgres/**/*", "pkg/tsdb/postgres/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Postgres"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/prometheus/**/*", "pkg/tsdb/prometheus/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Prometheus"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/tempo/**/*", "pkg/tsdb/tempo/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Tempo"
},
{
"type": "changedfiles",
"matches": [ "public/app/plugins/datasource/zipkin/**/*"],
"action": "updateLabel",
"addLabel": "datasource/Zipkin"
},
{
"type": "changedfiles",
"matches": ["public/app/features/variables/**/*", "public/app/features/templating/**/*"],
"action": "updateLabel",
"addLabel": "area/dashboard/templating"
},
{
"type": "changedfiles",
"matches": ["/pkg/services/ngalert/**/*", "/pkg/services/sqlstore/migrations/ualert/**/*", "/pkg/services/alerting/**/*", "/public/app/features/alerting/**/*", "/pkg/tests/api/alerting/**/*"],
"action": "updateLabel",
"addLabel": "area/alerting"
},
{
"type": "author",
"name": "pr/external",
"notMemberOf": { "org": "grafana" },
"ignoreList": ["renovate[bot]","dependabot[bot]"],
"action": "updateLabel",
"addLabel": "pr/external"
}
]
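
These labeling rules appear to be consumed by the commands action from grafana/grafana-github-actions (see the workflows below that pass configPath: commands / pr-commands). As a rough sketch of how a "changedfiles" rule maps a touched file to a label, assuming the minimatch npm package for glob matching (the real bot may resolve globs differently):

// Illustration only, not the actual bot code. Assumes minimatch's v3-style default export.
const minimatch = require('minimatch');
// Two of the rules above, reduced to what matters for matching
const rules = [
  { matches: ['public/app/features/explore/**/*'], addLabel: 'area/explore' },
  { matches: ['public/app/plugins/datasource/loki/**/*', 'pkg/tsdb/loki/**/*'], addLabel: 'datasource/Loki' },
];
// Collect every label whose globs match the changed file path
function labelsFor(changedFile) {
  return rules
    .filter((rule) => rule.matches.some((glob) => minimatch(changedFile, glob)))
    .map((rule) => rule.addLabel);
}
console.log(labelsFor('pkg/tsdb/loki/loki.go'));                    // [ 'datasource/Loki' ]
console.log(labelsFor('public/app/features/explore/Wrapper.tsx'));  // [ 'area/explore' ]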


@@ -1,58 +0,0 @@
{
"extends": [
"config:base"
],
"enabledManagers": ["npm"],
"ignoreDeps": [
"@grafana/slate-react", // should be updated when the `slate` package is updated
"@types/systemjs",
"@types/d3-force", // we should bump this once we move to esm modules
"@types/d3-interpolate", // we should bump this once we move to esm modules
"@types/d3-scale-chromatic", // we should bump this once we move to esm modules
"@types/grafana__slate-react", // should be updated when the `slate` package is updated
"@types/react-icons", // jaeger-ui-components is being refactored to use @grafana/ui icons instead
"commander", // we are planning to remove this, so no need to update it
"d3",
"d3-force", // we should bump this once we move to esm modules
"d3-interpolate", // we should bump this once we move to esm modules
"d3-scale-chromatic", // we should bump this once we move to esm modules
"execa", // we should bump this once we move to esm modules
"history", // we should bump this together with react-router-dom
"@mdx-js/react", // storybook peer-depends on it's 1.x version, we should upgrade this when we upgrade storybook
"monaco-editor", // due to us exposing this via @grafana/ui/CodeEditor's props bumping can break plugins
"node-fetch", // we should bump this once we move to esm modules
"react-hook-form", // due to us exposing these hooks via @grafana/ui form components bumping can break plugins
"react-icons", // jaeger-ui-components is being refactored to use @grafana/ui icons instead
"react-router-dom", // we should bump this together with history
"slate",
"slate-plain-serializer",
"systemjs",
"copy-webpack-plugin", // try to upgrade with newer yarn release. Not working with 3.1.1
"ts-loader", // we should remove ts-loader and use babel-loader instead
"ora" // we should bump this once we move to esm modules
],
"ignorePaths": ["packages/grafana-toolkit/package.json", "emails/**", "plugins-bundled/**", "**/mocks/**"],
"labels": ["area/frontend", "dependencies"],
"packageRules": [
{
"matchUpdateTypes": ["patch"],
"excludePackagePatterns": ["@storybook"],
"extends": ["schedule:monthly"],
"groupName": "Monthly patch updates"
},
{
"matchPackagePatterns": ["@storybook"],
"extends": ["schedule:monthly"],
"groupName": "Storybook updates"
}
],
"pin": {
"enabled": false
},
"prConcurrentLimit": 10,
"reviewers": ["team:grafana/frontend-ops"],
"separateMajorMinor": false,
"vulnerabilityAlerts": {
"addLabels": ["area/security"]
}
}

.github/stale.yml vendored Normal file

@@ -0,0 +1,47 @@
# Configuration for probot-stale - https://github.com/probot/stale
# General configuration
# Label to use when marking as stale
staleLabel: stale
# Pull request specific configuration
pulls:
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 14
# Number of days of inactivity before a stale Issue or Pull Request is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
daysUntilClose: 30
# Comment to post when marking as stale. Set to `false` to disable
markComment: >
This pull request has been automatically marked as stale because it has not had
activity in the last 2 weeks. It will be closed in 30 days if no further activity occurs. Please
feel free to give a status update now, ping for review, or re-open when it's ready.
Thank you for your contributions!
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This pull request has been automatically closed because it has not had
activity in the last 30 days. Please feel free to give a status update now, ping for review, or re-open when it's ready.
Thank you for your contributions!
# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1
exemptLabels:
- help wanted
- type/bug
- type/feature-request
- Epic
- no stalebot
# Issue specific configuration
issues:
limitPerRun: 1
daysUntilStale: 100000
daysUntilClose: 100000
markComment: >
This issue has been automatically marked as stale because it has not had activity in the
last 100 days. It will be closed in the next 100 days if no activity occurs.
Thank you for your contributions.
closeComment: >
This issue has been automatically closed because it has not had activity in the
last month and a half. If this issue is still valid, please ping a maintainer and ask them to check this again.
Thank you for your contributions.


@@ -1,26 +0,0 @@
name: Backport PR Creator
on:
pull_request_target:
types:
- closed
- labeled
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run backport
uses: ./actions/backport
with:
metricsWriteAPIKey: ${{secrets.GRAFANA_MISC_STATS_API_KEY}}
token: ${{secrets.GH_BOT_ACCESS_TOKEN}}
labelsToAdd: "backport"
title: "[{{base}}] {{originalTitle}}"


@@ -1,94 +0,0 @@
name: Bump version
on:
workflow_dispatch:
inputs:
version:
required: true
default: '7.x.x'
workflow_call:
inputs:
version_call:
description: Needs to match, exactly, the name of a version
required: true
type: string
secrets:
token:
required: true
metricsWriteAPIKey:
required: true
env:
YARN_ENABLE_IMMUTABLE_INSTALLS: false
jobs:
main:
runs-on: ubuntu-latest
steps:
# This is a basic workflow to help you get started with Actions
- uses: actions-ecosystem/action-regex-match@v2.0.2
if: ${{ github.event.inputs.version != '' }}
id: regex-match
with:
text: ${{ github.event.inputs.version }}
regex: '^(\d+.\d+).\d+(?:-beta.\d+)?$'
- uses: actions-ecosystem/action-regex-match@v2.0.2
if: ${{ inputs.version_call != '' }}
id: regex-match-version-call
with:
text: ${{ inputs.version_call }}
regex: '^(\d+.\d+).\d+(?:-beta\d+)?$'
- name: Validate input version
if: ${{ steps.regex-match.outputs.match == '' && github.event.inputs.version != '' }}
run: |
echo "The input version format is not correct, please respect:\
major.minor.patch or major.minor.patch-beta.number format. \
example: 7.4.3 or 7.4.3-beta.1"
exit 1
- name: Validate input version call
if: ${{ inputs.version_call != '' && steps.regex-match-version-call.outputs.match == '' }}
run: |
echo "The input version format is not correct, please respect:\
major.minor.patch or major.minor.patch-beta<number> format. \
example: 7.4.3 or 7.4.3-beta1"
exit 1
- uses: actions/checkout@v3
- name: Set intermediate variables
id: intermedia
run: |
echo "::set-output name=short_ref::${GITHUB_REF#refs/*/}"
echo "::set-output name=check_passed::false"
echo "::set-output name=branch_name::v${{steps.regex-match.outputs.group1}}"
echo "::set-output name=branch_exist::$(git ls-remote --heads https://github.com/grafana/grafana.git v${{ steps.regex-match.outputs.group1 }}.x | wc -l)"
- name: Check input version is aligned with branch(main)
if: ${{ github.event.inputs.version != '' && steps.intermedia.outputs.branch_exist == '0' && !contains(steps.intermedia.outputs.short_ref, 'main') }}
run: |
echo "When you want to deliver a new new minor version, you might want to create a new branch first \
with naming convention v[major].[minor].x, and just run the workflow on that branch. \
Run the workflow on main only when needed"
exit 1
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- uses: actions/setup-node@v3.2.0
with:
node-version: '16'
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run bump version (manually invoked)
if: ${{ github.event.inputs.version != '' }}
uses: ./actions/bump-version
with:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
metricsWriteAPIKey: ${{ secrets.GRAFANA_MISC_STATS_API_KEY }}
- name: Run bump version (workflow invoked)
if: ${{ inputs.version_call != '' }}
uses: ./actions/bump-version
with:
version_call: ${{ inputs.version_call }}
token: ${{ secrets.token }}
metricsWriteAPIKey: ${{ secrets.metricsWriteAPIKey }}
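
For reference, a quick stand-alone illustration (not part of the workflow) of what the two validation regexes above accept; note the dots are not escaped, so the patterns are slightly looser than the error messages suggest:

// Illustration only; run with node.
const manual = /^(\d+.\d+).\d+(?:-beta.\d+)?$/; // workflow_dispatch input (regex-match step)
const called = /^(\d+.\d+).\d+(?:-beta\d+)?$/;  // workflow_call input (regex-match-version-call step)
console.log(manual.test('7.4.3'));        // true
console.log(manual.test('7.4.3-beta.1')); // true
console.log(called.test('7.4.3-beta1'));  // true
console.log(manual.test('v7.4.3'));       // false -> the "Validate input version" step fails the job
console.log('7.4.3'.match(manual)[1]);    // '7.4' -> used above to build the v7.4.x branch name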


@@ -1,39 +0,0 @@
name: Close milestone
on:
workflow_dispatch:
inputs:
version:
required: true
description: Needs to match, exactly, the name of a milestone
workflow_call:
inputs:
version_call:
description: Needs to match, exactly, the name of a milestone
required: true
type: string
secrets:
token:
required: true
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Close milestone (manually invoked)
if: ${{ github.event.inputs.version != '' }}
uses: ./actions/close-milestone
with:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
- name: Close milestone (workflow invoked)
if: ${{ inputs.version_call != '' }}
uses: ./actions/close-milestone
with:
version_call: ${{ inputs.version_call }}
token: ${{ secrets.token }}


@@ -1,17 +0,0 @@
name: Cloud data sources test code coverage
on:
pull_request:
paths:
- 'pkg/tsdb/azuremonitor/**'
- 'pkg/tsdb/cloudwatch/**'
- 'pkg/tsdb/cloudmonitoring/**'
- 'public/app/plugins/datasource/grafana-azure-monitor-datasource/**'
- 'public/app/plugins/datasource/cloudwatch/**'
- 'public/app/plugins/datasource/cloud-monitoring/**'
branches-ignore:
- dependabot/**
- backport-*
jobs:
workflow-call:
uses: grafana/code-coverage/.github/workflows/code-coverage.yml@v0.1.2


@@ -1,53 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
name: "CodeQL"
on:
push:
branches: [main, v1.8.x, v2.0.x, v2.1.x, v2.6.x, v3.0.x, v3.1.x, v4.0.x, v4.1.x, v4.2.x, v4.3.x, v4.4.x, v4.5.x, v4.6.x, v4.7.x, v5.0.x, v5.1.x, v5.2.x, v5.3.x, v5.4.x, v6.0.x, v6.1.x, v6.2.x, v6.3.x, v6.4.x, v6.5.x, v6.6.x, v6.7.x, v7.0.x, v7.1.x, v7.2.x]
paths-ignore:
- '**/*.cue'
- '**/*.json'
- '**/*.md'
- '**/*.txt'
- '**/*.yml'
schedule:
- cron: '0 4 * * 6'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
# Override automatic language detection by changing the below list
# Supported options are ['csharp', 'cpp', 'go', 'java', 'javascript', 'python']
language: ['javascript', 'go', 'python']
# Learn more...
# https://docs.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/configuring-code-scanning#overriding-automatic-language-detection
steps:
- name: Checkout repository
uses: actions/checkout@v3
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# queries: ./path/to/local/query, your-org/your-repo/queries@main
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,26 +0,0 @@
name: Run commands when issues are labeled or comments added
on:
issues:
types: [labeled, unlabeled]
issue_comment:
types: [created]
concurrency:
group: issue-commands-${{ github.event.issue.number }}
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run Commands
uses: ./actions/commands
with:
metricsWriteAPIKey: ${{secrets.GRAFANA_MISC_STATS_API_KEY}}
token: ${{secrets.GH_BOT_ACCESS_TOKEN}}
configPath: commands


@@ -1,141 +0,0 @@
name: Levitate / Detect breaking changes
on: pull_request
jobs:
buildPR:
name: Build PR
runs-on: ubuntu-latest
defaults:
run:
working-directory: './pr'
steps:
- uses: actions/checkout@v3
with:
path: './pr'
- name: Get yarn cache directory path
id: yarn-cache-dir-path
run: echo "::set-output name=dir::$(yarn config get cacheFolder)"
- name: Restore yarn cache
uses: actions/cache@v2
id: yarn-cache
with:
path: ${{ steps.yarn-cache-dir-path.outputs.dir }}
key: yarn-cache-folder-${{ hashFiles('**/yarn.lock', '.yarnrc.yml') }}
restore-keys: |
yarn-cache-folder-
- name: Install dependencies
run: yarn install --immutable
- name: Build packages
run: yarn packages:build
- name: Zip built packages
run: zip -r ./pr_built_packages.zip ./packages/**/dist
- name: Upload build output as artifact
uses: actions/upload-artifact@v2
with:
name: buildPr
path: './pr/pr_built_packages.zip'
buildBase:
name: Build Base
runs-on: ubuntu-latest
defaults:
run:
working-directory: './base'
steps:
- uses: actions/checkout@v3
with:
path: './base'
ref: ${{ github.event.pull_request.base.ref }}
- name: Get yarn cache directory path
id: yarn-cache-dir-path
run: echo "::set-output name=dir::$(yarn config get cacheFolder)"
- name: Restore yarn cache
uses: actions/cache@v2
id: yarn-cache
with:
path: ${{ steps.yarn-cache-dir-path.outputs.dir }}
key: yarn-cache-folder-${{ hashFiles('**/yarn.lock', '.yarnrc.yml') }}
restore-keys: |
yarn-cache-folder-
- name: Install dependencies
run: yarn install --immutable
- name: Build packages
run: yarn packages:build
- name: Zip built packages
run: zip -r ./base_built_packages.zip ./packages/**/dist
- name: Upload build output as artifact
uses: actions/upload-artifact@v2
with:
name: buildBase
path: './base/base_built_packages.zip'
Detect:
name: Detect breaking changes
runs-on: ubuntu-latest
needs: ['buildPR', 'buildBase']
env:
GITHUB_STEP_NUMBER: 7
steps:
- uses: actions/checkout@v3
- name: Get built packages from pr
uses: actions/download-artifact@v3
with:
name: buildPr
- name: Get built packages from base
uses: actions/download-artifact@v3
with:
name: buildBase
- name: Unzip artifact from pr
run: unzip pr_built_packages.zip -d ./pr && rm pr_built_packages.zip
- name: Unzip artifact from base
run: unzip base_built_packages.zip -d ./base && rm base_built_packages.zip
- name: Get link for the Github Action job
id: job
uses: actions/github-script@v6
with:
script: |
const script = require('./.github/workflows/scripts/pr-get-job-link.js')
await script({github, context, core})
- name: Detect breaking changes
id: breaking-changes
run: ./scripts/check-breaking-changes.sh
env:
FORCE_COLOR: 3
GITHUB_JOB_LINK: ${{ steps.job.outputs.link }}
- name: Persisting the check output
run: |
mkdir -p ./levitate
echo "{ \"exit_code\": ${{ steps.breaking-changes.outputs.is_breaking }}, \"message\": \"${{ steps.breaking-changes.outputs.message }}\", \"job_link\": \"${{ steps.job.outputs.link }}#step:${GITHUB_STEP_NUMBER}:1\", \"pr_number\": \"${{ github.event.pull_request.number }}\" }" > ./levitate/result.json
- name: Upload check output as artifact
uses: actions/upload-artifact@v2
with:
name: levitate
path: levitate/
- name: Exit
run: exit ${{ steps.breaking-changes.outputs.is_breaking }}
shell: bash
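
The "Persisting the check output" step above serialises the Levitate result into ./levitate/result.json, which the reporting workflow further down downloads and parses. An illustrative example of that file's shape (all values made up):

// Illustration only; the real values come from the breaking-changes step outputs.
const exampleResult = {
  exit_code: 1,
  message: '@grafana/data: possible breaking changes detected',
  job_link: 'https://github.com/grafana/grafana/runs/123456789?check_suite_focus=true#step:7:1',
  pr_number: '52345',
};
console.log(JSON.stringify(exampleResult, null, 2));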


@@ -1,184 +0,0 @@
name: Levitate / Report breaking changes
on:
workflow_run:
workflows: ["Levitate / Detect breaking changes"]
types: [completed]
jobs:
notify:
name: Report
runs-on: ubuntu-latest
env:
ARTIFACT_FOLDER: '${{ github.workspace }}/tmp'
ARTIFACT_NAME: 'levitate'
steps:
- uses: actions/checkout@v3
- name: 'Download artifact'
uses: actions/github-script@v6
env:
RUN_ID: ${{ github.event.workflow_run.id }}
with:
script: |
const fs = require('fs');
const { owner, repo } = context.repo;
const runId = process.env.RUN_ID;
const artifactName = process.env.ARTIFACT_NAME;
const artifactFolder = process.env.ARTIFACT_FOLDER;
const artifacts = await github.rest.actions.listWorkflowRunArtifacts({
owner,
repo,
run_id: runId,
});
const artifact = artifacts.data.artifacts.find(a => a.name === artifactName);
if (!artifact) {
throw new Error(`Could not find artifact ${ artifactName } in workflow (${ runId })`);
}
const download = await github.rest.actions.downloadArtifact({
owner,
repo,
artifact_id: artifact.id,
archive_format: 'zip',
});
fs.mkdirSync(artifactFolder, { recursive: true });
fs.writeFileSync(`${ artifactFolder }/${ artifactName }.zip`, Buffer.from(download.data));
- name: Unzip artifact
run: unzip "${ARTIFACT_FOLDER}/${ARTIFACT_NAME}.zip" -d "${ARTIFACT_FOLDER}"
- name: Parsing levitate result
uses: actions/github-script@v6
id: levitate-run
with:
script: |
const filePath = `${ process.env.ARTIFACT_FOLDER }/result.json`;
const script = require('./.github/workflows/scripts/json-file-to-job-output.js');
await script({ core, filePath });
- name: Check if "levitate breaking change" label exists
id: does-label-exist
uses: actions/github-script@v6
env:
PR_NUMBER: ${{ github.event.workflow_run.pull_requests[0].number }}
with:
script: |
const { data } = await github.rest.issues.listLabelsOnIssue({
issue_number: process.env.PR_NUMBER,
owner: context.repo.owner,
repo: context.repo.repo,
});
const labels = data.map(({ name }) => name);
const doesExist = labels.includes('levitate breaking change');
return doesExist ? 1 : 0;
- name: Comment on PR
if: ${{ steps.levitate-run.outputs.exit_code == 1 }}
uses: marocchino/sticky-pull-request-comment@v2
with:
number: ${{ steps.levitate-run.outputs.pr_number }}
message: |
⚠️ &nbsp;&nbsp;**Possible breaking changes**
_(Open the links below in a new tab to go to the correct steps)_
${{ steps.levitate-run.outputs.message }}
[Console output](${{ steps.levitate-run.outputs.job_link }})
[Read our guideline](https://github.com/grafana/grafana/blob/main/contribute/breaking-changes-guide.md)
- name: Remove comment on PR
if: ${{ steps.levitate-run.outputs.exit_code == 0 }}
uses: marocchino/sticky-pull-request-comment@v2
with:
number: ${{ steps.levitate-run.outputs.pr_number }}
delete: true
# Posts a notification to Slack if a PR has a breaking change and it did not have a breaking change before
- name: Post to Slack
id: slack
if: ${{ steps.levitate-run.outputs.exit_code == 1 && steps.does-label-exist.outputs.result == 0 }}
uses: slackapi/slack-github-action@v1.18.0
with:
payload: |
{
"pr_link": "https://github.com/grafana/grafana/pull/${{ steps.levitate-run.outputs.pr_number }}",
"pr_number": "${{ steps.levitate-run.outputs.pr_number }}",
"job_link": "${{ steps.levitate-run.outputs.job_link }}",
"reporting_job_link": "${{ github.event.workflow_run.html_url }}",
"message": "${{ steps.levitate-run.outputs.message }}"
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_LEVITATE_WEBHOOK_URL }}
- name: Add "levitate breaking change" label
if: ${{ steps.levitate-run.outputs.exit_code == 1 && steps.does-label-exist.outputs.result == 0 }}
uses: actions/github-script@v6
env:
PR_NUMBER: ${{ steps.levitate-run.outputs.pr_number }}
with:
github-token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
script: |
await github.rest.issues.addLabels({
issue_number: process.env.PR_NUMBER,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['levitate breaking change']
})
- name: Remove "levitate breaking change" label
if: ${{ steps.levitate-run.outputs.exit_code == 0 && steps.does-label-exist.outputs.result == 1 }}
uses: actions/github-script@v6
env:
PR_NUMBER: ${{ steps.levitate-run.outputs.pr_number }}
with:
github-token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
script: |
await github.rest.issues.removeLabel({
issue_number: process.env.PR_NUMBER,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'levitate breaking change'
})
# This is very weird: the actual request goes through (comes back with a 201), but does not assign the team.
# Related issue: https://github.com/renovatebot/renovate/issues/1908
- name: Add "grafana/plugins-platform-frontend" as a reviewer
if: ${{ steps.levitate-run.outputs.exit_code == 1 }}
uses: actions/github-script@v6
env:
PR_NUMBER: ${{ steps.levitate-run.outputs.pr_number }}
with:
github-token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
script: |
await github.rest.pulls.requestReviewers({
pull_number: process.env.PR_NUMBER,
owner: context.repo.owner,
repo: context.repo.repo,
reviewers: [],
team_reviewers: ['grafana/plugins-platform-frontend']
});
- name: Remove "grafana/plugins-platform-frontend" from the list of reviewers
if: ${{ steps.levitate-run.outputs.exit_code == 0 }}
uses: actions/github-script@v6
env:
PR_NUMBER: ${{ steps.levitate-run.outputs.pr_number }}
with:
github-token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
script: |
await github.rest.pulls.removeRequestedReviewers({
pull_number: process.env.PR_NUMBER,
owner: context.repo.owner,
repo: context.repo.repo,
reviewers: [],
team_reviewers: ['grafana/plugins-platform-frontend']
});


@@ -1,16 +0,0 @@
name: "doc-validator"
on:
pull_request:
paths: ["docs/sources/**"]
workflow_dispatch:
jobs:
doc-validator:
runs-on: "ubuntu-latest"
container:
image: "grafana/doc-validator:latest"
steps:
- name: "Checkout code"
uses: "actions/checkout@v3"
- name: "Run doc-validator tool"
# Ensure that the CI always passes until all errors are resolved.
run: "doc-validator ./docs/sources || true"


@@ -1,26 +0,0 @@
name: Enterprise PR check
on:
pull_request:
branches:
- main
- 'v[0-9]+.[0-9]+.x'
jobs:
dispatch:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v2
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Repository Dispatch
uses: ./actions/repository-dispatch
with:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
repository: grafana/grafana-enterprise
event_type: oss-pull-request
client_payload:
'{"source_branch": "${{ github.head_ref }}", "target_branch": "${{ github.base_ref }}", "pr_number": "${{ github.event.number }}"}'


@@ -1,24 +0,0 @@
name: Create or update GitHub release
on:
workflow_dispatch:
inputs:
version:
required: true
description: Needs to match, exactly, the name of a milestone (NO v prefix)
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run github release action
uses: ./actions/github-release
with:
token: ${{secrets.GH_BOT_ACCESS_TOKEN}}
metricsWriteAPIKey: ${{secrets.GRAFANA_MISC_STATS_API_KEY}}


@@ -1,35 +0,0 @@
#
# When triggered by the cron job it will also collect metrics for:
# * number of issues without label
# * number of issues with "needs more info"
# * number of issues with "needs investigation"
# * number of issues with label type/bug
# * number of open issues in current milestone
#
# https://github.com/grafana/grafana-github-actions/blob/main/metrics-collector/index.ts
#
name: Github issue metrics collection
on:
schedule:
- cron: "*/10 * * * *"
issues:
types: [opened, closed]
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run metrics collector
uses: ./actions/metrics-collector
with:
metricsWriteAPIKey: ${{secrets.GRAFANA_MISC_STATS_API_KEY}}
token: ${{secrets.GH_BOT_ACCESS_TOKEN}}
configPath: "metrics-collector"


@@ -1,21 +0,0 @@
name: Close Milestone
on:
workflow_dispatch:
inputs:
version_input:
description: 'The version to be released. Use the major.minor.patch or major.minor.patch-beta<number> format, for example 7.4.3 or 7.4.3-beta1'
required: true
jobs:
call-remove-milestone:
uses: grafana/grafana/.github/workflows/remove-milestone.yml@main
with:
version_call: ${{ github.event.inputs.version_input }}
secrets:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
call-close-milestone:
uses: grafana/grafana/.github/workflows/close-milestone.yml@main
with:
version_call: ${{ github.event.inputs.version_input }}
secrets:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
needs: call-remove-milestone


@@ -1,35 +0,0 @@
name: PR Checks
on:
pull_request_target:
types:
- opened
- reopened
- synchronize
- ready_for_review
- labeled
- unlabeled
- edited
issues:
types:
- milestoned
- demilestoned
concurrency:
group: pr-checks-${{ github.event.number }}
jobs:
main:
runs-on: ubuntu-latest
if: github.event.pull_request.draft == false
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run PR Checks
uses: ./actions/pr-checks
with:
token: ${{secrets.GITHUB_TOKEN}}
configPath: pr-checks


@@ -1,29 +0,0 @@
name: "CodeQL for PR / go"
on:
pull_request:
branches: [main]
paths:
- '**/*.go'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v3
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
with:
languages: "go"
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,31 +0,0 @@
name: "CodeQL for PR / javascript"
on:
pull_request:
branches: [main]
paths:
- '**/*.js'
- '**/*.ts'
- '**/*.tsx'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v3
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
with:
languages: "javascript"
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,29 +0,0 @@
name: "CodeQL for PR / python"
on:
pull_request:
branches: [main]
paths:
- '**/*.py'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v3
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
with:
languages: "python"
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,18 +0,0 @@
name: Run when PRs are closed
on:
pull_request:
types:
- closed
concurrency:
group: pr-commands-closed-${{ github.event.number }}
jobs:
close_job:
# this job will only run if the PR has been closed without being merged
if: github.event.pull_request.merged == false
runs-on: ubuntu-latest
steps:
- run: |
echo "PR #${{ github.event.number }} has been closed without being merged, removing milestone."
gh pr edit ${{ github.event.number }} --milestone "" --repo $GITHUB_REPOSITORY
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -1,26 +0,0 @@
name: PR automation
on:
pull_request_target:
types:
- opened
- synchronize
concurrency:
group: pr-commands-${{ github.event.number }}
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run Commands
uses: ./actions/commands
with:
metricsWriteAPIKey: ${{secrets.GRAFANA_MISC_STATS_API_KEY}}
token: ${{secrets.GH_BOT_ACCESS_TOKEN}}
configPath: pr-commands


@@ -1,23 +0,0 @@
name: Prepare release
on:
workflow_dispatch:
inputs:
version_input:
description: 'The version to be released. Use the major.minor.patch or major.minor.patch-beta<number> format, for example 7.4.3 or 7.4.3-beta1'
required: true
jobs:
call-bump-version:
uses: grafana/grafana/.github/workflows/bump-version.yml@main
with:
version_call: ${{ github.event.inputs.version_input }}
secrets:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
metricsWriteAPIKey: ${{ secrets.GRAFANA_MISC_STATS_API_KEY }}
call-update-changelog:
uses: grafana/grafana/.github/workflows/update-changelog.yml@main
with:
version_call: ${{ github.event.inputs.version_input }}
secrets:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
metricsWriteAPIKey: ${{ secrets.GRAFANA_MISC_STATS_API_KEY }}
needs: call-bump-version


@@ -3,10 +3,9 @@ name: publish_docs
on:
push:
branches:
- v9.0.x
- master
paths:
- 'docs/sources/**'
- 'packages/grafana-*/**'
jobs:
build:
@@ -14,26 +13,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- run: git clone --single-branch --no-tags --depth 1 -b master https://grafanabot:${{ secrets.GH_BOT_ACCESS_TOKEN }}@github.com/grafana/website-sync ./.github/actions/website-sync
- name: generate-packages-docs
uses: actions/setup-node@v3.2.0
id: generate-docs
with:
node-version: '16'
- name: Get yarn cache directory path
id: yarn-cache-dir-path
run: echo "::set-output name=dir::$(yarn config get cacheFolder)"
- uses: actions/cache@v2.1.7
with:
path: ${{ steps.yarn-cache-dir-path.outputs.dir }}
key: yarn-${{ hashFiles('**/yarn.lock') }}
restore-keys: |
yarn-
- run: yarn install --immutable
- run: ./scripts/ci-reference-docs-build.sh
- uses: actions/checkout@v1
- name: publish-to-git
uses: ./.github/actions/website-sync
uses: ./.github/actions/gha-publish-to-git
id: publish
with:
repository: grafana/website
@@ -42,8 +24,7 @@ jobs:
github_pat: '${{ secrets.GH_BOT_ACCESS_TOKEN }}'
source_folder: docs/sources
target_folder: content/docs/grafana/latest
allow_no_changes: 'true'
- shell: bash
run: |
test -n "${{ steps.publish.outputs.commit_hash }}"
test -n "${{ steps.publish.outputs.working_directory }}"
test -n "${{ steps.publish.outputs.working_directory }}"


@@ -1,39 +0,0 @@
name: Remove milestone
on:
workflow_dispatch:
inputs:
version:
required: true
description: Needs to match, exactly, the name of a milestone
workflow_call:
inputs:
version_call:
description: Needs to match, exactly, the name of a milestone
required: true
type: string
secrets:
token:
required: true
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Remove milestone from open issues (manually invoked)
if: ${{ github.event.inputs.version != '' }}
uses: ./actions/remove-milestone
with:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
- name: Remove milestone from open issues (workflow invoked)
if: ${{ inputs.version_call != '' }}
uses: ./actions/remove-milestone
with:
version_call: ${{ inputs.version_call }}
token: ${{ secrets.token }}


@@ -1,27 +0,0 @@
module.exports = async ({ core, filePath }) => {
try {
const fs = require('fs');
const content = await readFile(fs, filePath);
const result = JSON.parse(content);
core.startGroup('Parsing json file...');
for (const property in result) {
core.info(`${property} <- ${result[property]}`);
core.setOutput(property, result[property]);
}
core.endGroup();
} catch (error) {
core.setFailed(error.message);
}
}
async function readFile(fs, path) {
return new Promise((resolve, reject) => {
fs.readFile(path, (error, data) => {
if (error) return reject(error);
return resolve(data);
});
});
}
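
A rough local harness for the script above (this is not how the workflow invokes it; in CI, core comes from @actions/core and each setOutput call becomes a step output consumed by later steps):

// Illustration only. The file path matches the artifact layout used by the Levitate workflows.
const jsonFileToJobOutput = require('./.github/workflows/scripts/json-file-to-job-output.js');
// Minimal stand-in for @actions/core, just enough to see what the script does
const fakeCore = {
  startGroup: (name) => console.log(`--- ${name}`),
  endGroup: () => {},
  info: (msg) => console.log(msg),
  setOutput: (key, value) => console.log(`output: ${key}=${value}`),
  setFailed: (msg) => { process.exitCode = 1; console.error(msg); },
};
jsonFileToJobOutput({ core: fakeCore, filePath: './levitate/result.json' });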


@@ -1,9 +0,0 @@
module.exports = async ({ github, context, core }) => {
const { owner, repo } = context.repo;
const url = `https://api.github.com/repos/${owner}/${repo}/actions/runs/${context.runId}/jobs`
const result = await github.request(url)
const link = `https://github.com/grafana/grafana/runs/${result.data.jobs[0].id}?check_suite_focus=true`;
core.setOutput('link', link);
}


@@ -1,33 +0,0 @@
name: 'Close stale issues and PRs'
on:
schedule:
- cron: '30 1 * * *'
jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v5
with:
repo-token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
# Number of days of inactivity before a stale Issue or Pull Request is closed.
# Set to -1 to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
days-before-close: 14
# Number of days of inactivity before an Issue or Pull Request becomes stale
days-before-stale: 30
# We don't want any Issues to be marked as stale for now.
days-before-issue-stale: -1
exempt-issue-labels: no stalebot
exempt-pr-labels: no stalebot
operations-per-run: 500
stale-issue-label: stale
stale-pr-label: stale
stale-pr-message: >
This pull request has been automatically marked as stale because it has not had
activity in the last 30 days. It will be closed in 2 weeks if no further activity occurs. Please
feel free to give a status update now, ping for review, or re-open when it's ready.
Thank you for your contributions!
close-pr-message: >
This pull request has been automatically closed because it has not had
activity in the last 2 weeks. Please feel free to give a status update now, ping for review, or re-open when it's ready.
Thank you for your contributions!


@@ -1,43 +0,0 @@
name: Update changelog
on:
workflow_dispatch:
inputs:
version:
required: true
description: Needs to match, exactly, the name of a milestone
workflow_call:
inputs:
version_call:
description: Needs to match, exactly, the name of a milestone
required: true
type: string
secrets:
token:
required: true
metricsWriteAPIKey:
required: true
jobs:
main:
runs-on: ubuntu-latest
steps:
- name: Checkout Actions
uses: actions/checkout@v3
with:
repository: "grafana/grafana-github-actions"
path: ./actions
ref: main
- name: Install Actions
run: npm install --production --prefix ./actions
- name: Run update changelog (manually invoked)
if: ${{ github.event.inputs.version != '' }}
uses: ./actions/update-changelog
with:
token: ${{ secrets.GH_BOT_ACCESS_TOKEN }}
metricsWriteAPIKey: ${{ secrets.GRAFANA_MISC_STATS_API_KEY }}
- name: Run update changelog (workflow invoked)
if: ${{ inputs.version_call != '' }}
uses: ./actions/update-changelog
with:
version_call: ${{ inputs.version_call }}
token: ${{ secrets.token }}
metricsWriteAPIKey: ${{ secrets.metricsWriteAPIKey }}

.gitignore vendored

@@ -11,36 +11,13 @@ awsconfig
/public/views/error.html
/emails/dist
/reports
/e2e/tmp
/scripts/grafana-server/tmp
vendor/
/docs/menu.yaml
/requests
tsconfig.tsbuildinfo
# Yarn
.yarn/*
!.yarn/patches
!.yarn/releases
!.yarn/plugins
!.yarn/sdks
!.yarn/versions
# we temporarily commit this file because downloading it with yarn
# somehow produces different checksum values
!.yarn/cache/pa11y-ci-https-1e9675e9e1-668c9119bd.zip
.pnp.*
.yarnrc
.yarn/
# Enterprise emails
/emails/templates/enterprise_*
/public/emails/enterprise_*
# Enterprise reporting fonts
/public/fonts/dejavu
# Enterprise devenv
/devenv/docker/blocks/grafana-enterprise
/devenv/docker/blocks/saml-enterprise
/tmp
tools/phantomjs/phantomjs
tools/phantomjs/phantomjs.exe
@@ -63,10 +40,7 @@ public/css/*.min.css
*.tmp
.DS_Store
.vscode/
!.vscode/launch.json
.vs/
.eslintcache
.stylelintcache
/data/*
/bin/*
@@ -89,8 +63,6 @@ profile.cov
/pkg/cmd/grafana-server/grafana-server
/pkg/cmd/grafana-server/debug
/pkg/extensions/*
/pkg/server/wireexts_enterprise.go
/pkg/cmd/grafana-cli/runner/wireexts_enterprise.go
!/pkg/extensions/main.go
/public/app/extensions
debug.test
@@ -98,7 +70,6 @@ debug.test
/packaging/**/*.rpm
/packaging/**/*.deb
/packaging/**/*.tar.gz
/packaging/**/*.tar.gz.sha256
# Ignore OSX indexing
.DS_Store
@@ -112,8 +83,6 @@ debug.test
/devenv/bulk-dashboards/*.json
/devenv/bulk_alerting_dashboards/*.json
/devenv/datasources_bulk.yaml
/devenv/bulk_alerting_dashboards/bulk_alerting_datasources.yaml
/scripts/build/release_publisher/release_publisher
*.patch
@@ -134,35 +103,6 @@ compilation-stats.json
/packages/grafana-e2e/cypress/screenshots
/packages/grafana-e2e/cypress/videos
/packages/grafana-e2e/cypress/logs
/e2e/server.log
/e2e/**/screenshots
!/e2e/**/screenshots/expected/*
/e2e/**/videos/*
/e2e/benchmarks/**/results/*
/e2e/benchmarks/**/results
/e2e/build_results.zip
# grafana server
/scripts/grafana-server/server.log
# a11y tests
/pa11y-ci-results.json
/pa11y-ci-report
# report dumping the whole system env
/report.*.json
# auto generated frontend docs
/docs/sources/packages_api
# auto generated Go files
*_gen.go
!pkg/services/featuremgmt/toggles_gen.go
# Auto-generated localisation files
public/locales/_build/
public/locales/**/*.js
deployment_tools_config.json
.betterer.cache
/public/e2e-test/screenShots/theOutput
/public/e2e-tests/screenShots/theOutput/*.png
/public/e2e-tests/videos


@@ -1,8 +0,0 @@
#!/bin/sh
# Ignore husky hooks if no frontend code has been changed
git diff --cached --name-only | grep -v --quiet "^pkg/" || exit 0
. "$(dirname "$0")/_/husky.sh"
yarn run precommit


@@ -1,32 +0,0 @@
{
"locales": [
"en",
"fr",
"es",
"pseudo-LOCALE"
],
"catalogs": [
{
"path": "public/locales/{locale}/messages",
"include": [
"public/app"
],
"exclude": [
"**/*.d.ts",
"**/*.test.ts",
"**/node_modules/**",
"public/app/plugins"
]
}
],
"fallbackLocales": {
"pseudo-LOCALE": "en",
"default": "en"
},
"pseudoLocale": "pseudo-LOCALE",
"sourceLocale": "en",
"format": "po",
"formatOptions": {
"lineNumbers": false
}
}

.nvmrc

@@ -1 +0,0 @@
v16.14.0


@@ -1,123 +0,0 @@
var config = {
defaults: {
concurrency: 1,
runners: ['axe'],
useIncognitoBrowserContext: false,
chromeLaunchConfig: {
args: ['--no-sandbox'],
},
// see https://github.com/grafana/grafana/pull/41693#issuecomment-979921463 for context
// on why we're ignoring singleValue/react-select-*-placeholder elements
hideElements: '#updateVersion, [class*="-singleValue"], [id^="react-select-"][id$="-placeholder"]',
},
urls: [
{
url: '${HOST}/login',
wait: 500,
rootElement: '.main-view',
threshold: 12,
},
{
url: '${HOST}/login',
wait: 500,
actions: [
"wait for element input[name='user'] to be added",
"set field input[name='user'] to admin",
"set field input[name='password'] to admin",
"click element button[aria-label='Login button']",
"wait for element [aria-label='Skip change password button'] to be visible",
],
threshold: 13,
rootElement: '.main-view',
},
{
url: '${HOST}/?orgId=1',
wait: 500,
threshold: 0,
},
{
url: '${HOST}/d/O6f11TZWk/panel-tests-bar-gauge',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/d/O6f11TZWk/panel-tests-bar-gauge?orgId=1&editview=settings',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/?orgId=1&search=open',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/alerting/list',
wait: 500,
rootElement: '.main-view',
// the unified alerting promotion alert's content contrast is too low
// see https://github.com/grafana/grafana/pull/41829
threshold: 5,
},
{
url: '${HOST}/datasources',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/org/users',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/org/teams',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/plugins',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/org',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/org/apikeys',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
{
url: '${HOST}/dashboards',
wait: 500,
rootElement: '.main-view',
threshold: 0,
},
],
};
function myPa11yCiConfiguration(urls, defaults) {
const HOST_SERVER = process.env.HOST || 'localhost';
const PORT_SERVER = process.env.PORT || '3001';
for (var idx = 0; idx < urls.length; idx++) {
urls[idx] = { ...urls[idx], url: urls[idx].url.replace('${HOST}', `${HOST_SERVER}:${PORT_SERVER}`) };
}
return {
defaults: defaults,
urls: urls,
};
}
module.exports = myPa11yCiConfiguration(config.urls, config.defaults);
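
A tiny stand-alone illustration of the '${HOST}' substitution performed by myPa11yCiConfiguration above, using the same defaults:

// Illustration only; mirrors the replace() call in the function above.
const host = process.env.HOST || 'localhost';
const port = process.env.PORT || '3001';
console.log('${HOST}/login'.replace('${HOST}', `${host}:${port}`)); // -> localhost:3001/login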


@@ -1,106 +0,0 @@
var config = {
defaults: {
concurrency: 1,
runners: ['axe'],
useIncognitoBrowserContext: false,
chromeLaunchConfig: {
args: ['--no-sandbox'],
},
// see https://github.com/grafana/grafana/pull/41693#issuecomment-979921463 for context
// on why we're ignoring singleValue/react-select-*-placeholder elements
hideElements: '#updateVersion, [class*="-singleValue"], [id^="react-select-"][id$="-placeholder"]',
},
urls: [
{
url: '${HOST}/login',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/login', //skip password and login
actions: [
"wait for element input[name='user'] to be added",
"set field input[name='user'] to admin",
"set field input[name='password'] to admin",
"click element button[aria-label='Login button']",
"wait for element [aria-label='Skip change password button'] to be visible",
],
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/?orgId=1',
wait: 500,
},
{
url: '${HOST}/d/O6f11TZWk/panel-tests-bar-gauge',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/d/O6f11TZWk/panel-tests-bar-gauge?orgId=1&editview=settings',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/?orgId=1&search=open',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/alerting/list',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/datasources',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/org/users',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/org/teams',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/plugins',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/org',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/org/apikeys',
wait: 500,
rootElement: '.main-view',
},
{
url: '${HOST}/dashboards',
wait: 500,
rootElement: '.main-view',
},
],
};
function myPa11yCiConfiguration(urls, defaults) {
const HOST_SERVER = process.env.HOST || 'localhost';
const PORT_SERVER = process.env.PORT || '3001';
for (var idx = 0; idx < urls.length; idx++) {
urls[idx] = { ...urls[idx], url: urls[idx].url.replace('${HOST}', `${HOST_SERVER}:${PORT_SERVER}`) };
}
return {
defaults: defaults,
urls: urls,
};
}
module.exports = myPa11yCiConfiguration(config.urls, config.defaults);


@@ -1,27 +1,9 @@
.git
.github
.yarn
build
compiled
data
deployment_tools_config.json
devenv
dist
e2e/tmp
dist/
pkg/
node_modules
pkg
public/lib/monaco
public/sass/*.generated.scss
scripts/grafana-server/tmp
vendor
public/vendor/
vendor/
data/
# TS generate from cue by cuetsy
**/*.gen.ts
# Auto-generated localisation files
public/locales/_build/
public/locales/**/*.js
# Auto-generated theme files
theme.light.generated.json
theme.dark.generated.json


@@ -1,3 +0,0 @@
module.exports = {
...require('@grafana/toolkit/src/config/prettier.plugin.config.json'),
};


@@ -1,6 +0,0 @@
{
"eslint.packageManager": "yarn",
"eslint.nodePath": ".yarn/sdks",
"workspace.workspaceFolderCheckCwd": false,
"tsserver.tsdk": ".yarn/sdks/typescript/lib"
}

.vscode/launch.json vendored

@@ -1,25 +0,0 @@
{
"version": "0.2.0",
"configurations": [
{
"name": "Run Server",
"type": "go",
"request": "launch",
"mode": "auto",
"program": "${workspaceFolder}/pkg/cmd/grafana-server/",
"env": {},
"cwd": "${workspaceFolder}",
"args": ["--homepath", "${workspaceFolder}", "--packaging", "dev"]
},
{
"name": "Debug Jest test",
"type": "node",
"request": "launch",
"runtimeExecutable": "yarn",
"runtimeArgs": ["run", "jest", "--runInBand", "${file}"],
"console": "integratedTerminal",
"internalConsoleOptions": "neverOpen",
"port": 9229
}
]
}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require eslint/bin/eslint.js
require(absPnpApiPath).setup();
}
}
// Defer to the real eslint/bin/eslint.js your application uses
module.exports = absRequire(`eslint/bin/eslint.js`);


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require eslint
require(absPnpApiPath).setup();
}
}
// Defer to the real eslint your application uses
module.exports = absRequire(`eslint`);


@@ -1,6 +0,0 @@
{
"name": "eslint",
"version": "8.11.0-sdk",
"main": "./lib/api.js",
"type": "commonjs"
}


@@ -1,6 +0,0 @@
# This file is automatically generated by @yarnpkg/sdks.
# Manual changes might be lost!
integrations:
- vscode
- vim


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require prettier/index.js
require(absPnpApiPath).setup();
}
}
// Defer to the real prettier/index.js your application uses
module.exports = absRequire(`prettier/index.js`);


@@ -1,6 +0,0 @@
{
"name": "prettier",
"version": "2.6.0-sdk",
"main": "./index.js",
"type": "commonjs"
}


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require typescript/bin/tsc
require(absPnpApiPath).setup();
}
}
// Defer to the real typescript/bin/tsc your application uses
module.exports = absRequire(`typescript/bin/tsc`);


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require typescript/bin/tsserver
require(absPnpApiPath).setup();
}
}
// Defer to the real typescript/bin/tsserver your application uses
module.exports = absRequire(`typescript/bin/tsserver`);


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require typescript/lib/tsc.js
require(absPnpApiPath).setup();
}
}
// Defer to the real typescript/lib/tsc.js your application uses
module.exports = absRequire(`typescript/lib/tsc.js`);


@@ -1,208 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
const moduleWrapper = tsserver => {
if (!process.versions.pnp) {
return tsserver;
}
const {isAbsolute} = require(`path`);
const pnpApi = require(`pnpapi`);
const isVirtual = str => str.match(/\/(\$\$virtual|__virtual__)\//);
const isPortal = str => str.startsWith("portal:/");
const normalize = str => str.replace(/\\/g, `/`).replace(/^\/?/, `/`);
const dependencyTreeRoots = new Set(pnpApi.getDependencyTreeRoots().map(locator => {
return `${locator.name}@${locator.reference}`;
}));
// VSCode sends the zip paths to TS using the "zip://" prefix, that TS
// doesn't understand. This layer makes sure to remove the protocol
// before forwarding it to TS, and to add it back on all returned paths.
function toEditorPath(str) {
// We add the `zip:` prefix to both `.zip/` paths and virtual paths
if (isAbsolute(str) && !str.match(/^\^?(zip:|\/zip\/)/) && (str.match(/\.zip\//) || isVirtual(str))) {
// We also take the opportunity to turn virtual paths into physical ones;
// this makes it much easier to work with workspaces that list peer
// dependencies, since otherwise Ctrl+Click would bring us to the virtual
// file instances instead of the real ones.
//
// We only do this to modules owned by the dependency tree roots.
// This avoids breaking the resolution when jumping inside a vendor
// with peer dep (otherwise jumping into react-dom would show resolution
// errors on react).
//
const resolved = isVirtual(str) ? pnpApi.resolveVirtual(str) : str;
if (resolved) {
const locator = pnpApi.findPackageLocator(resolved);
if (locator && (dependencyTreeRoots.has(`${locator.name}@${locator.reference}`) || isPortal(locator.reference))) {
str = resolved;
}
}
str = normalize(str);
if (str.match(/\.zip\//)) {
switch (hostInfo) {
// Absolute VSCode `Uri.fsPath`s need to start with a slash.
// VSCode only adds it automatically for supported schemes,
// so we have to do it manually for the `zip` scheme.
// The path needs to start with a caret otherwise VSCode doesn't handle the protocol
//
// Ref: https://github.com/microsoft/vscode/issues/105014#issuecomment-686760910
//
// Update 2021-10-08: VSCode changed their format in 1.61.
// Before | ^zip:/c:/foo/bar.zip/package.json
// After | ^/zip//c:/foo/bar.zip/package.json
//
// Update 2022-04-06: VSCode changed the format in 1.66.
// Before | ^/zip//c:/foo/bar.zip/package.json
// After | ^/zip/c:/foo/bar.zip/package.json
//
case `vscode <1.61`: {
str = `^zip:${str}`;
} break;
case `vscode <1.66`: {
str = `^/zip/${str}`;
} break;
case `vscode`: {
str = `^/zip${str}`;
} break;
// To make "go to definition" work,
// We have to resolve the actual file system path from virtual path
// and convert scheme to supported by [vim-rzip](https://github.com/lbrayner/vim-rzip)
case `coc-nvim`: {
str = normalize(resolved).replace(/\.zip\//, `.zip::`);
str = resolve(`zipfile:${str}`);
} break;
// Support neovim native LSP and [typescript-language-server](https://github.com/theia-ide/typescript-language-server)
// We have to resolve the actual file system path from virtual path,
// everything else is up to neovim
case `neovim`: {
str = normalize(resolved).replace(/\.zip\//, `.zip::`);
str = `zipfile://${str}`;
} break;
default: {
str = `zip:${str}`;
} break;
}
}
}
return str;
}
function fromEditorPath(str) {
switch (hostInfo) {
case `coc-nvim`: {
str = str.replace(/\.zip::/, `.zip/`);
// The path for coc-nvim is in format of /<pwd>/zipfile:/<pwd>/.yarn/...
// So in order to convert it back, we use .* to match everything
// before `zipfile:`
return process.platform === `win32`
? str.replace(/^.*zipfile:\//, ``)
: str.replace(/^.*zipfile:/, ``);
} break;
case `neovim`: {
str = str.replace(/\.zip::/, `.zip/`);
// The path for neovim is in format of zipfile:///<pwd>/.yarn/...
return str.replace(/^zipfile:\/\//, ``);
} break;
case `vscode`:
default: {
return process.platform === `win32`
? str.replace(/^\^?(zip:|\/zip)\/+/, ``)
: str.replace(/^\^?(zip:|\/zip)\/+/, `/`);
} break;
}
}
// Force enable 'allowLocalPluginLoads'
// TypeScript tries to resolve plugins using a path relative to itself
// which doesn't work when using the global cache
// https://github.com/microsoft/TypeScript/blob/1b57a0395e0bff191581c9606aab92832001de62/src/server/project.ts#L2238
// VSCode doesn't want to enable 'allowLocalPluginLoads' due to security concerns but
// TypeScript already does local loads and if this code is running the user trusts the workspace
// https://github.com/microsoft/vscode/issues/45856
const ConfiguredProject = tsserver.server.ConfiguredProject;
const {enablePluginsWithOptions: originalEnablePluginsWithOptions} = ConfiguredProject.prototype;
ConfiguredProject.prototype.enablePluginsWithOptions = function() {
this.projectService.allowLocalPluginLoads = true;
return originalEnablePluginsWithOptions.apply(this, arguments);
};
// And here is the point where we hijack the VSCode <-> TS communications
// by adding ourselves in the middle. We locate everything that looks
// like an absolute path of ours and normalize it.
const Session = tsserver.server.Session;
const {onMessage: originalOnMessage, send: originalSend} = Session.prototype;
let hostInfo = `unknown`;
Object.assign(Session.prototype, {
onMessage(/** @type {string | object} */ message) {
const isStringMessage = typeof message === 'string';
const parsedMessage = isStringMessage ? JSON.parse(message) : message;
if (
parsedMessage != null &&
typeof parsedMessage === `object` &&
parsedMessage.arguments &&
typeof parsedMessage.arguments.hostInfo === `string`
) {
hostInfo = parsedMessage.arguments.hostInfo;
if (hostInfo === `vscode` && process.env.VSCODE_IPC_HOOK) {
if (/(\/|-)1\.([1-5][0-9]|60)\./.test(process.env.VSCODE_IPC_HOOK)) {
hostInfo += ` <1.61`;
} else if (/(\/|-)1\.(6[1-5])\./.test(process.env.VSCODE_IPC_HOOK)) {
hostInfo += ` <1.66`;
}
}
}
const processedMessageJSON = JSON.stringify(parsedMessage, (key, value) => {
return typeof value === 'string' ? fromEditorPath(value) : value;
});
return originalOnMessage.call(
this,
isStringMessage ? processedMessageJSON : JSON.parse(processedMessageJSON)
);
},
send(/** @type {any} */ msg) {
return originalSend.call(this, JSON.parse(JSON.stringify(msg, (key, value) => {
return typeof value === `string` ? toEditorPath(value) : value;
})));
}
});
return tsserver;
};
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require typescript/lib/tsserver.js
require(absPnpApiPath).setup();
}
}
// Defer to the real typescript/lib/tsserver.js your application uses
module.exports = moduleWrapper(absRequire(`typescript/lib/tsserver.js`));
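
A stand-alone illustration of the round trip performed by toEditorPath() and fromEditorPath() above for a modern VSCode host (the zip path below is made up):

// Illustration only; the real functions also handle coc-nvim, neovim and older VSCode formats.
const tsPath = '/repo/.yarn/cache/lodash-npm-4.17.21-abc.zip/node_modules/lodash/index.d.ts';
const editorPath = `^/zip${tsPath}`;                               // what toEditorPath() hands to VSCode
const backAgain = editorPath.replace(/^\^?(zip:|\/zip)\/+/, '/');  // what fromEditorPath() returns to TS
console.log(editorPath);           // ^/zip/repo/.yarn/cache/...
console.log(backAgain === tsPath); // true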


@@ -1,208 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
const moduleWrapper = tsserver => {
if (!process.versions.pnp) {
return tsserver;
}
const {isAbsolute} = require(`path`);
const pnpApi = require(`pnpapi`);
const isVirtual = str => str.match(/\/(\$\$virtual|__virtual__)\//);
const isPortal = str => str.startsWith("portal:/");
const normalize = str => str.replace(/\\/g, `/`).replace(/^\/?/, `/`);
const dependencyTreeRoots = new Set(pnpApi.getDependencyTreeRoots().map(locator => {
return `${locator.name}@${locator.reference}`;
}));
// VSCode sends the zip paths to TS using the "zip://" prefix, that TS
// doesn't understand. This layer makes sure to remove the protocol
// before forwarding it to TS, and to add it back on all returned paths.
function toEditorPath(str) {
// We add the `zip:` prefix to both `.zip/` paths and virtual paths
if (isAbsolute(str) && !str.match(/^\^?(zip:|\/zip\/)/) && (str.match(/\.zip\//) || isVirtual(str))) {
// We also take the opportunity to turn virtual paths into physical ones;
// this makes it much easier to work with workspaces that list peer
// dependencies, since otherwise Ctrl+Click would bring us to the virtual
// file instances instead of the real ones.
//
// We only do this to modules owned by the dependency tree roots.
// This avoids breaking the resolution when jumping inside a vendor
// with peer dep (otherwise jumping into react-dom would show resolution
// errors on react).
//
const resolved = isVirtual(str) ? pnpApi.resolveVirtual(str) : str;
if (resolved) {
const locator = pnpApi.findPackageLocator(resolved);
if (locator && (dependencyTreeRoots.has(`${locator.name}@${locator.reference}`) || isPortal(locator.reference))) {
str = resolved;
}
}
str = normalize(str);
if (str.match(/\.zip\//)) {
switch (hostInfo) {
// Absolute VSCode `Uri.fsPath`s need to start with a slash.
// VSCode only adds it automatically for supported schemes,
// so we have to do it manually for the `zip` scheme.
// The path needs to start with a caret, otherwise VSCode doesn't handle the protocol
//
// Ref: https://github.com/microsoft/vscode/issues/105014#issuecomment-686760910
//
// Update 2021-10-08: VSCode changed their format in 1.61.
// Before | ^zip:/c:/foo/bar.zip/package.json
// After | ^/zip//c:/foo/bar.zip/package.json
//
// Update 2022-04-06: VSCode changed the format in 1.66.
// Before | ^/zip//c:/foo/bar.zip/package.json
// After | ^/zip/c:/foo/bar.zip/package.json
//
case `vscode <1.61`: {
str = `^zip:${str}`;
} break;
case `vscode <1.66`: {
str = `^/zip/${str}`;
} break;
case `vscode`: {
str = `^/zip${str}`;
} break;
// To make "go to definition" work,
// we have to resolve the actual file system path from the virtual path
// and convert the scheme to one supported by [vim-rzip](https://github.com/lbrayner/vim-rzip)
case `coc-nvim`: {
str = normalize(resolved).replace(/\.zip\//, `.zip::`);
str = resolve(`zipfile:${str}`);
} break;
// To support neovim's native LSP and [typescript-language-server](https://github.com/theia-ide/typescript-language-server),
// we only have to resolve the actual file system path from the virtual path;
// everything else is up to neovim
case `neovim`: {
str = normalize(resolved).replace(/\.zip\//, `.zip::`);
str = `zipfile://${str}`;
} break;
default: {
str = `zip:${str}`;
} break;
}
}
}
return str;
}
function fromEditorPath(str) {
switch (hostInfo) {
case `coc-nvim`: {
str = str.replace(/\.zip::/, `.zip/`);
// The path for coc-nvim is in the format /<pwd>/zipfile:/<pwd>/.yarn/...
// so to convert it back, we use .* to match everything
// before `zipfile:`
return process.platform === `win32`
? str.replace(/^.*zipfile:\//, ``)
: str.replace(/^.*zipfile:/, ``);
} break;
case `neovim`: {
str = str.replace(/\.zip::/, `.zip/`);
// The path for neovim is in the format zipfile:///<pwd>/.yarn/...
return str.replace(/^zipfile:\/\//, ``);
} break;
case `vscode`:
default: {
return process.platform === `win32`
? str.replace(/^\^?(zip:|\/zip)\/+/, ``)
: str.replace(/^\^?(zip:|\/zip)\/+/, `/`);
} break;
}
}
// Force enable 'allowLocalPluginLoads'
// TypeScript tries to resolve plugins using a path relative to itself
// which doesn't work when using the global cache
// https://github.com/microsoft/TypeScript/blob/1b57a0395e0bff191581c9606aab92832001de62/src/server/project.ts#L2238
// VSCode doesn't want to enable 'allowLocalPluginLoads' due to security concerns but
// TypeScript already does local loads and if this code is running the user trusts the workspace
// https://github.com/microsoft/vscode/issues/45856
const ConfiguredProject = tsserver.server.ConfiguredProject;
const {enablePluginsWithOptions: originalEnablePluginsWithOptions} = ConfiguredProject.prototype;
ConfiguredProject.prototype.enablePluginsWithOptions = function() {
this.projectService.allowLocalPluginLoads = true;
return originalEnablePluginsWithOptions.apply(this, arguments);
};
// And here is the point where we hijack the VSCode <-> TS communications
// by adding ourselves in the middle. We locate everything that looks
// like an absolute path of ours and normalize it.
const Session = tsserver.server.Session;
const {onMessage: originalOnMessage, send: originalSend} = Session.prototype;
let hostInfo = `unknown`;
Object.assign(Session.prototype, {
onMessage(/** @type {string | object} */ message) {
const isStringMessage = typeof message === 'string';
const parsedMessage = isStringMessage ? JSON.parse(message) : message;
if (
parsedMessage != null &&
typeof parsedMessage === `object` &&
parsedMessage.arguments &&
typeof parsedMessage.arguments.hostInfo === `string`
) {
hostInfo = parsedMessage.arguments.hostInfo;
if (hostInfo === `vscode` && process.env.VSCODE_IPC_HOOK) {
if (/(\/|-)1\.([1-5][0-9]|60)\./.test(process.env.VSCODE_IPC_HOOK)) {
hostInfo += ` <1.61`;
} else if (/(\/|-)1\.(6[1-5])\./.test(process.env.VSCODE_IPC_HOOK)) {
hostInfo += ` <1.66`;
}
}
}
const processedMessageJSON = JSON.stringify(parsedMessage, (key, value) => {
return typeof value === 'string' ? fromEditorPath(value) : value;
});
return originalOnMessage.call(
this,
isStringMessage ? processedMessageJSON : JSON.parse(processedMessageJSON)
);
},
send(/** @type {any} */ msg) {
return originalSend.call(this, JSON.parse(JSON.stringify(msg, (key, value) => {
return typeof value === `string` ? toEditorPath(value) : value;
})));
}
});
return tsserver;
};
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require typescript/lib/tsserverlibrary.js
require(absPnpApiPath).setup();
}
}
// Defer to the real typescript/lib/tsserverlibrary.js your application uses
module.exports = moduleWrapper(absRequire(`typescript/lib/tsserverlibrary.js`));
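For reference, a minimal sketch (with a made-up archive path) of the editor-specific prefixes that toEditorPath produces for a file stored inside a zip archive, matching the cases above; the real coc-nvim branch additionally resolves the result against the working directory, which is omitted here.
// Hypothetical example paths only; the keys correspond to the hostInfo cases above.
const zipPath = `/c:/foo/bar.zip/package.json`;
const byEditor = {
  [`vscode <1.61`]: `^zip:${zipPath}`,   // ^zip:/c:/foo/bar.zip/package.json
  [`vscode <1.66`]: `^/zip/${zipPath}`,  // ^/zip//c:/foo/bar.zip/package.json
  [`vscode`]: `^/zip${zipPath}`,         // ^/zip/c:/foo/bar.zip/package.json
  [`coc-nvim`]: `zipfile:${zipPath.replace(/\.zip\//, `.zip::`)}`,
  [`neovim`]: `zipfile://${zipPath.replace(/\.zip\//, `.zip::`)}`,
  [`default`]: `zip:${zipPath}`,
};
console.log(byEditor[`neovim`]);
// -> zipfile:///c:/foo/bar.zip::package.json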


@@ -1,20 +0,0 @@
#!/usr/bin/env node
const {existsSync} = require(`fs`);
const {createRequire, createRequireFromPath} = require(`module`);
const {resolve} = require(`path`);
const relPnpApiPath = "../../../../.pnp.cjs";
const absPnpApiPath = resolve(__dirname, relPnpApiPath);
const absRequire = (createRequire || createRequireFromPath)(absPnpApiPath);
if (existsSync(absPnpApiPath)) {
if (!process.versions.pnp) {
// Setup the environment to be able to require typescript/lib/typescript.js
require(absPnpApiPath).setup();
}
}
// Defer to the real typescript/lib/typescript.js your application uses
module.exports = absRequire(`typescript/lib/typescript.js`);


@@ -1,6 +0,0 @@
{
"name": "typescript",
"version": "4.6.4-sdk",
"main": "./lib/typescript.js",
"type": "commonjs"
}


@@ -1,81 +0,0 @@
enableTelemetry: false
nodeLinker: pnp
packageExtensions:
"@grafana/slate-react@0.22.10-grafana":
peerDependencies:
slate-react: ">=0.22.0"
"@mdx-js/loader@1.6.22":
peerDependencies:
react: 17.0.1
"@storybook/addon-docs@6.4.21":
peerDependencies:
"@storybook/manager-webpack5": 6.4.21
"@storybook/addon-essentials@6.4.21":
peerDependencies:
"@storybook/components": 6.4.21
"@storybook/core-events": 6.4.21
"@storybook/manager-webpack5": 6.4.21
"@storybook/theming": 6.4.21
"@storybook/core-server@6.4.21":
peerDependencies:
"@babel/core": ^7.0.0
"@storybook/core@6.4.21":
peerDependencies:
"@babel/core": ^7.0.0
"@storybook/manager-webpack5": 6.4.21
"@storybook/csf-tools@6.4.21":
peerDependencies:
"@babel/core": ^7.0.0
"@storybook/react@6.4.21":
peerDependencies:
"@storybook/manager-webpack5": 6.4.21
doctrine@3.0.0:
dependencies:
assert: 2.0.0
moveable@0.29.8:
dependencies:
"@daybrush/utils": 1.6.0
framework-utils: ^1.1.0
rc-time-picker@3.7.3:
peerDependencies:
react: 17.0.1
react-dom: 17.0.1
rc-trigger@2.6.5:
peerDependencies:
react: 17.0.1
react-dom: 17.0.1
react-compat-css-styled@1.0.8:
dependencies:
react-simple-compat: 1.2.1
react-compat-moveable@0.17.8:
dependencies:
"@egjs/agent": ^2.2.1
"@egjs/children-differ": ^1.0.1
"@scena/matrix": 1.1.1
css-to-mat: ^1.0.3
gesto: ^1.7.0
overlap-area: ^1.0.0
react-simple-compat: 1.2.1
peerDependencies:
framework-utils: ^1.1.0
react-docgen-typescript-loader@3.7.2:
peerDependencies:
webpack: 4.41.5
react-icons@2.2.7:
peerDependencies:
prop-types: "*"
react-resizable@3.0.4:
peerDependencies:
react-dom: 17.0.1
plugins:
- path: .yarn/plugins/@yarnpkg/plugin-typescript.cjs
spec: "@yarnpkg/plugin-typescript"
- path: .yarn/plugins/@yarnpkg/plugin-interactive-tools.cjs
spec: "@yarnpkg/plugin-interactive-tools"
- path: .yarn/plugins/@yarnpkg/plugin-outdated.cjs
spec: "https://mskelton.dev/yarn-outdated/v2"
yarnPath: .yarn/releases/yarn-3.2.1.cjs

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -8,19 +8,19 @@ In the interest of fostering an open and welcoming environment, we as contributo
Examples of behavior that contributes to creating a positive environment include:
- Using welcoming and inclusive language
- Being respectful of differing viewpoints and experiences
- Gracefully accepting constructive criticism
- Focusing on what is best for the community
- Showing empathy towards other community members
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
- The use of sexualized language or imagery and unwelcome sexual attention or advances
- Trolling, insulting/derogatory comments, and personal or political attacks
- Public or private harassment
- Publishing others' private information, such as a physical or electronic address, without explicit permission
- Other conduct which could reasonably be considered inappropriate in a professional setting
* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting
## Our Responsibilities
@@ -34,7 +34,7 @@ This Code of Conduct applies both within project spaces and in public spaces whe
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at conduct@grafana.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at contact@grafana.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.


@@ -19,22 +19,9 @@ For more ways to contribute, check out the [Open Source Guides](https://opensour
### Report bugs
Before submitting a new issue, try to make sure someone hasn't already reported the problem. Look through the [existing issues](https://github.com/grafana/grafana/issues) for similar issues.
Report a bug by submitting a [bug report](https://github.com/grafana/grafana/issues/new?labels=type%3A+bug&template=1-bug_report.md). Make sure that you provide as much information as possible on how to reproduce the bug.
Follow the issue template and add additional information that will help us replicate the problem.
For data visualization issues:
- Query results from the inspect drawer (data tab & query inspector)
- Panel settings can be extracted in the panel inspect drawer JSON tab
For dashboard-related issues:
- Dashboard JSON can be found in the dashboard settings JSON model view
For authentication and alerting issues, Grafana server logs are useful.
Before submitting a new issue, try to make sure someone hasn't already reported the problem. Look through the [existing issues](https://github.com/grafana/grafana/issues) for similar issues.
#### Security issues
@@ -42,14 +29,10 @@ If you believe you've found a security vulnerability, please read our [security
### Suggest enhancements
If you have an idea of how to improve Grafana, submit an [enhancement request](https://github.com/grafana/grafana/discussions/new).
If you have an idea of how to improve Grafana, submit an [enhancement request](https://github.com/grafana/grafana/issues/new?labels=type%3A+feature+request&template=2-feature_request.md).
We want to make Grafana accessible to even more people. Submit an [accessibility issue](https://github.com/grafana/grafana/issues/new?labels=type%3A+accessibility&template=3-accessibility.md) to help us understand what we can improve.
### Write documentation
To edit or write technical content, refer to [Contribute to our documentation](/contribute/documentation/README.md). We welcome your expertise and input as our body of technical content grows.
### Triage issues
If you don't have the knowledge or time to code, consider helping with _issue triage_. The community will thank you for saving them time by spending some of yours.
@@ -75,11 +58,10 @@ When you're ready to contribute, it's time to [Create a pull request](/contribut
#### Contributor License Agreement (CLA)
Before we can accept your pull request, you need to [sign our CLA](https://grafana.com/docs/grafana/latest/developers/cla/). If you haven't, our CLA assistant prompts you to when you create your pull request.
Before we can accept your pull request, you need to [sign our CLA](https://grafana.com/docs/contribute/cla/). If you haven't, our CLA assistant prompts you to when you create your pull request.
## Where do I go from here?
- Set up your [development environment](contribute/developer-guide.md).
- Learn how to [contribute documentation](contribute/README.md).
- Get started [developing plugins](https://grafana.com/docs/grafana/latest/developers/plugins/) for Grafana.
- Look through the resources in the [contribute](contribute) folder.
- Learn how to [contribute documentation](contribute/documentation.md).
- Get started [developing plugins](https://grafana.com/docs/plugins/developing/development/) for Grafana.


@@ -1,92 +1,97 @@
FROM node:16-alpine3.15 as js-builder
# Golang build container
FROM golang:1.13.4-alpine
ENV NODE_OPTIONS=--max_old_space_size=8000
RUN apk add --no-cache gcc g++
WORKDIR /grafana
WORKDIR $GOPATH/src/github.com/grafana/grafana
COPY package.json yarn.lock .yarnrc.yml ./
COPY .yarn .yarn
COPY go.mod go.sum ./
COPY vendor vendor
RUN go mod verify
COPY pkg pkg
COPY build.go package.json ./
RUN go run build.go build
# Node build container
FROM node:12.13.0-alpine
# PhantomJS
RUN apk add --no-cache curl &&\
cd /tmp && curl -Ls https://github.com/dustinblackman/phantomized/releases/download/2.1.1/dockerized-phantomjs.tar.gz | tar xz &&\
cp -R lib lib64 / &&\
cp -R usr/lib/x86_64-linux-gnu /usr/lib &&\
cp -R usr/share /usr/share &&\
cp -R etc/fonts /etc &&\
curl -k -Ls https://bitbucket.org/ariya/phantomjs/downloads/phantomjs-2.1.1-linux-x86_64.tar.bz2 | tar -jxf - &&\
cp phantomjs-2.1.1-linux-x86_64/bin/phantomjs /usr/local/bin/phantomjs
WORKDIR /usr/src/app/
COPY package.json yarn.lock ./
COPY packages packages
COPY plugins-bundled plugins-bundled
RUN yarn install
RUN yarn install --pure-lockfile --no-progress
COPY tsconfig.json .eslintrc .editorconfig .browserslistrc .prettierrc.js babel.config.json .linguirc ./
COPY Gruntfile.js tsconfig.json .eslintrc .editorconfig .browserslistrc ./
COPY public public
COPY tools tools
COPY scripts scripts
COPY emails emails
ENV NODE_ENV production
RUN yarn build
RUN ./node_modules/.bin/grunt build
FROM golang:1.17.11-alpine3.15 as go-builder
RUN apk add --no-cache gcc g++ make
WORKDIR /grafana
COPY go.mod go.sum embed.go Makefile build.go package.json ./
COPY cue cue
COPY packages/grafana-schema packages/grafana-schema
COPY public/app/plugins public/app/plugins
COPY public/api-spec.json public/api-spec.json
COPY pkg pkg
COPY scripts scripts
COPY cue.mod cue.mod
COPY .bingo .bingo
RUN go mod verify
RUN make build-go
# Final stage
FROM alpine:3.15
# Final container
FROM alpine:3.10
LABEL maintainer="Grafana team <hello@grafana.com>"
ARG GF_UID="472"
ARG GF_GID="0"
ARG GF_GID="472"
ENV PATH="/usr/share/grafana/bin:$PATH" \
GF_PATHS_CONFIG="/etc/grafana/grafana.ini" \
GF_PATHS_DATA="/var/lib/grafana" \
GF_PATHS_HOME="/usr/share/grafana" \
GF_PATHS_LOGS="/var/log/grafana" \
GF_PATHS_PLUGINS="/var/lib/grafana/plugins" \
GF_PATHS_PROVISIONING="/etc/grafana/provisioning"
GF_PATHS_CONFIG="/etc/grafana/grafana.ini" \
GF_PATHS_DATA="/var/lib/grafana" \
GF_PATHS_HOME="/usr/share/grafana" \
GF_PATHS_LOGS="/var/log/grafana" \
GF_PATHS_PLUGINS="/var/lib/grafana/plugins" \
GF_PATHS_PROVISIONING="/etc/grafana/provisioning"
WORKDIR $GF_PATHS_HOME
RUN apk add --no-cache ca-certificates bash tzdata musl-utils
RUN apk add --no-cache openssl ncurses-libs ncurses-terminfo-base --repository=http://dl-cdn.alpinelinux.org/alpine/edge/main
RUN apk upgrade ncurses-libs ncurses-terminfo-base --repository=http://dl-cdn.alpinelinux.org/alpine/edge/main
RUN apk info -vv | sort
RUN apk add --no-cache ca-certificates bash tzdata && \
apk add --no-cache --upgrade --repository=http://dl-cdn.alpinelinux.org/alpine/edge/main openssl musl-utils
COPY conf ./conf
RUN if [ ! $(getent group "$GF_GID") ]; then \
addgroup -S -g $GF_GID grafana; \
fi
RUN mkdir -p "$GF_PATHS_HOME/.aws" && \
addgroup -S -g $GF_GID grafana && \
adduser -S -u $GF_UID -G grafana grafana && \
mkdir -p "$GF_PATHS_PROVISIONING/datasources" \
"$GF_PATHS_PROVISIONING/dashboards" \
"$GF_PATHS_PROVISIONING/notifiers" \
"$GF_PATHS_LOGS" \
"$GF_PATHS_PLUGINS" \
"$GF_PATHS_DATA" && \
cp "$GF_PATHS_HOME/conf/sample.ini" "$GF_PATHS_CONFIG" && \
cp "$GF_PATHS_HOME/conf/ldap.toml" /etc/grafana/ldap.toml && \
chown -R grafana:grafana "$GF_PATHS_DATA" "$GF_PATHS_HOME/.aws" "$GF_PATHS_LOGS" "$GF_PATHS_PLUGINS" "$GF_PATHS_PROVISIONING" && \
chmod -R 777 "$GF_PATHS_DATA" "$GF_PATHS_HOME/.aws" "$GF_PATHS_LOGS" "$GF_PATHS_PLUGINS" "$GF_PATHS_PROVISIONING"
RUN export GF_GID_NAME=$(getent group $GF_GID | cut -d':' -f1) && \
mkdir -p "$GF_PATHS_HOME/.aws" && \
adduser -S -u $GF_UID -G "$GF_GID_NAME" grafana && \
mkdir -p "$GF_PATHS_PROVISIONING/datasources" \
"$GF_PATHS_PROVISIONING/dashboards" \
"$GF_PATHS_PROVISIONING/notifiers" \
"$GF_PATHS_PROVISIONING/plugins" \
"$GF_PATHS_PROVISIONING/access-control" \
"$GF_PATHS_LOGS" \
"$GF_PATHS_PLUGINS" \
"$GF_PATHS_DATA" && \
cp "$GF_PATHS_HOME/conf/sample.ini" "$GF_PATHS_CONFIG" && \
cp "$GF_PATHS_HOME/conf/ldap.toml" /etc/grafana/ldap.toml && \
chown -R "grafana:$GF_GID_NAME" "$GF_PATHS_DATA" "$GF_PATHS_HOME/.aws" "$GF_PATHS_LOGS" "$GF_PATHS_PLUGINS" "$GF_PATHS_PROVISIONING" && \
chmod -R 777 "$GF_PATHS_DATA" "$GF_PATHS_HOME/.aws" "$GF_PATHS_LOGS" "$GF_PATHS_PLUGINS" "$GF_PATHS_PROVISIONING"
# PhantomJS
COPY --from=1 /tmp/lib /lib
COPY --from=1 /tmp/lib64 /lib64
COPY --from=1 /tmp/usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu
COPY --from=1 /tmp/usr/share /usr/share
COPY --from=1 /tmp/etc/fonts /etc/fonts
COPY --from=1 /usr/local/bin/phantomjs /usr/local/bin
COPY --from=go-builder /grafana/bin/*/grafana-server /grafana/bin/*/grafana-cli ./bin/
COPY --from=js-builder /grafana/public ./public
COPY --from=js-builder /grafana/tools ./tools
COPY --from=0 /go/src/github.com/grafana/grafana/bin/linux-amd64/grafana-server /go/src/github.com/grafana/grafana/bin/linux-amd64/grafana-cli ./bin/
COPY --from=1 /usr/src/app/public ./public
COPY --from=1 /usr/src/app/tools ./tools
COPY tools/phantomjs/render.js ./tools/phantomjs/render.js
EXPOSE 3000

Some files were not shown because too many files have changed in this diff.