Compare commits

...

37 Commits

Author SHA1 Message Date
Hugo Häggmark
443a0ba78e Chore: Fixes lines that exceeded 150 chars 2019-10-08 02:10:35 -07:00
Hugo Häggmark
21bbb7530c CherryPicks for 6.4.2 2019-10-08 02:10:35 -07:00
Marcus Efraimsson
3478088482 Table: Proper handling of json data with dataframes (#19596)
When using a Raw Document query with Elasticsearch, the datasource returns a special
response that includes a type field with the value json. The table panel
has a transformation for JSON data which, up until this fix, didn't work
at all due to the new data structure we call data frames.

Co-Authored-By: Hugo Häggmark <hugo.haggmark@grafana.com>

Fixes #19531

(cherry picked from commit 0ad2242fb8)
2019-10-08 02:10:35 -07:00
Torkel Ödegaard
d17fb21ecc SharedQuery: Fixed issue when using rows (#19610)
(cherry picked from commit 7c2ed5c1fc)
2019-10-08 02:10:35 -07:00
Hugo Häggmark
d94eaea64e SingleStat: Fixes $__name postfix/prefix usage (#19687)
Fixes #19567

(cherry picked from commit 58badd70b0)
2019-10-08 02:10:35 -07:00
Hugo Häggmark
45971205b0 Grafana Image Renderer: Fixes plugin page (#19664)
Fixes #19659

(cherry picked from commit 5202770bdc)
2019-10-08 02:10:35 -07:00
Anthony Templeton
3d95eea6ba CloudWatch: Changes incorrect dimension wmlid to wlmid (#19679)
Fixes #19476

(cherry picked from commit 6c0b5163dd)
2019-10-08 02:10:35 -07:00
Andrej Ocenas
54a092e0a1 Loki: Fix lookup for label key token (#19579)
(cherry picked from commit 5238faf6da)
2019-10-08 02:10:35 -07:00
David
4437f8af26 Rename live option in queries (#19658)
(cherry picked from commit cf7ace6aad)
2019-10-08 02:10:35 -07:00
Torkel Ödegaard
35213f192c DataFormats: When transforming TableModel -> DataFrame -> Table preserve the type attribute (#19621)
(cherry picked from commit 99c1c16a04)
2019-10-08 02:10:35 -07:00
Torkel Ödegaard
1006650ae4 Graph: Updated auto decimals logic and test dashboard (#19618)
(cherry picked from commit 6f0faa595b)
2019-10-08 02:10:35 -07:00
Torkel Ödegaard
f1225330e2 Graph: Switching to series mode should re-render graph (#19623)
(cherry picked from commit 0016189f28)
2019-10-08 02:10:35 -07:00
andreaslangnevyjel
4f888d9660 Units: fixed wrong id for Terabits/sec (#19611)
(cherry picked from commit 45e0ebcc57)
2019-10-08 02:10:35 -07:00
Ha Huynh
969c60e87c Profile: Fix issue with user profile sometimes not showing more than sessions (#19578)
* fix <react-profile-wrapper> crashing

* refix crashing profile page

(cherry picked from commit 4b042c89fe)
2019-10-08 02:10:35 -07:00
Sofia Papagiannaki
931dd93d91 Login: Show SAML login button if SAML is enabled (#19591)
* Show SAML login button if SAML is enabled

Move logic inside LoginServiceButtons

* Prevent from rendering login-oauth div if no login service is enabled

(cherry picked from commit a62dea47b4)
2019-10-08 02:10:35 -07:00
Hugo Häggmark
34a172e133 Prometheus: Fixes so results in Panel always are sorted by query order (#19597)
Fixes #19529

(cherry picked from commit f9611250ea)
2019-10-08 02:10:35 -07:00
David
9b764e3a20 Loki: remove live option for logs panel (#19533)
* Loki: remove live option for logs panel

* Remove live from logs panel docs

(cherry picked from commit 942f702d80)
2019-10-08 02:10:35 -07:00
gotjosh
23aa9b6e45 LDAP: Show non-matched groups returned from LDAP (#19208)
* LDAP: Show all LDAP groups

* Use the returned LDAP groups as the reference when debugging LDAP

We need to use the returned LDAP groups as the main reference for
determining what we were able to match and what we weren't. Before, we
were using the groups configured in the LDAP TOML configuration file.

* s/User name/Username

* Add a title for the LDAP mapping results

* LDAP: UI Updates to debug view

* LDAP: Make it explicit when we weren't able to match teams

(cherry picked from commit b20a258b72)
2019-10-08 02:10:35 -07:00
Torkel Ödegaard
4ba8388f3a Cherry picks for v6.4.1 (#19554)
* Provisioning: Handle empty nested keys on YAML provisioning datasources (#19547)

* Fix: Handle empty nested keys on YAML provisioning datasources

As we provision a datasource via a YAML file, we attempt to transform the
file into sensible Go types that the provisioning code can use.

While this happens, there is a chance some of the keys nested within
the YAML array are empty.

This fix allows the YAML parser to handle empty keys by null-checking
the return value of `reflect.TypeOf`, which, according to the documentation:

> TypeOf returns the reflection Type that represents the dynamic type of i. If i is a nil interface value, TypeOf returns nil.

can return nil.

* Add tests

(cherry picked from commit 8e508e5ce4)

* Updated version to 6.4.1
2019-10-02 09:16:20 +02:00
Peter Holmberg
c3b3ad4380 grafana/ui: Add Timezone picker (#19364)
* first things and story

* fixed data structure and fixed picker

* remove console log

* move variables into global scope

(cherry picked from commit bb0a438705)
2019-10-01 02:37:48 -07:00
Hugo Häggmark
bef64b046c release 6.4.0 2019-10-01 02:37:48 -07:00
Torkel Ödegaard
4edafb7c8c Panels: Skip re-rendering panel/visualisation in loading state (#19518)
* Loading states and partial rendering, set loading state in mixed data source, and do not render loading states for react panels

* Updated mixed data source tests

(cherry picked from commit 0ec8303878)
2019-10-01 02:37:48 -07:00
Torkel Ödegaard
3c0268d671 SeriesOverrides: Fixed issue with color picker
(cherry picked from commit c712b4f824)
2019-10-01 02:37:48 -07:00
David
126296826b Logs: Publish logs panel (#19504)
* Logs: Publish logs panel

- remove alpha state from plugins definition
- add panel documentation
- updated panel reference in Loki docs

* Review feedback

(cherry picked from commit 265669710c)
2019-10-01 02:37:48 -07:00
Ivana Huckova
dd75bb67bb Explore: Update broken link to logql docs (#19510)
* Explore: Update broken link to logql docs

* Explore: Remove console logs

* Explore: Add filter expression heading to link target

(cherry picked from commit 9b5bc819f4)
2019-10-01 02:37:48 -07:00
Marcus Efraimsson
c31f39ca11 Build: Upgrade go to 1.12.10 (#19499)
Fixes #19451

(cherry picked from commit d65a3318ab)
2019-10-01 02:37:48 -07:00
Andrej Ocenas
e17af53428 CLI: Fix version selection for plugin install (#19498)
(cherry picked from commit 3866814ea9)
2019-10-01 02:37:48 -07:00
Dominik Prokop
4d1617c1dd grafana/toolkit: Remove hack to expose plugin/e2e exports & types (#19467)
(cherry picked from commit 1b5e7ceee7)
2019-10-01 02:37:48 -07:00
Alexander Zobnin
3cb8b896dd Users: revert LDAP admin user page (#19463)
(cherry picked from commit 3520db1c66)
2019-10-01 02:37:48 -07:00
Ivana Huckova
943f661a75 Explore: Take root_url setting into account when redirecting from dashboard to explore (#19447)
* Explore: Take root_url setting into account when redirecting from dashboard to explore

* Explore: Move adding of subath to getExploreUrl function

* Explore: Fix explore redirect for key bindings

(cherry picked from commit 40fbea977e)
2019-10-01 02:37:48 -07:00
Peter Holmberg
b2c1473e59 grafana/ui: Fix value time zone names to be capitalized (#19417)
(cherry picked from commit 8024c39435)
2019-10-01 02:37:48 -07:00
Dominik Prokop
052ea8f63b Release: Make sure packages are released from clean git state (#19402)
(cherry picked from commit dadc2925a2)
2019-10-01 02:37:48 -07:00
Dominik Prokop
38e88083a3 DataLinks: suggestions menu improvements (#19396)
* Deduplicate series labels in datalinks variables suggestions

* Always show all variables available in datalinks suggestions

(cherry picked from commit 97beb26f0c)
2019-10-01 02:37:48 -07:00
Hugo Häggmark
6232cfcdda PanelData: Adds timeRange prop to PanelData (#19361)
* Refactor: Adds newTimeRange property to PanelData

* Refactor: Handles timeRange prop after requests

* Refactor: Makes timeRange mandatory

* Refactor: Adds DefaultTimeRange

(cherry picked from commit 889f8e3131)
2019-10-01 02:37:48 -07:00
Torkel Ödegaard
aa7659d1dd Build: fixed signing script issue with circle-ci (#19397)
(cherry picked from commit 680a22b898)
2019-09-25 11:56:22 +02:00
Marcus Efraimsson
199031a6e2 Cherry picks for v6.4.0-beta2 (#19378)
* API: adds redirect helper to simplify http redirects (#19180)

(cherry picked from commit dd794625dd)

* Dashboard: Fixes back button styles in kiosk mode (#19165)

Fixes: #18114
(cherry picked from commit 38e948a1ad)

* Menu: fix menu button in the mobile view (#19191)

* replace "sandwich" (menu) button with logo(back home) if kiosk=tv
* update navbar initialize padding-left because menu button is overlapped by the navbar
(cherry picked from commit 5ef40b259d)

* LDAP debug page: deduplicate errors (#19168)

(cherry picked from commit 6b2e95a1f2)

* MSSQL: Revert usage of new connectionstring format (#19203)

This reverts commit 2514209 from #18384. Reason is that it doesn't
work due to xorm 0.7.1 which doesn't support this new connectionstring
format.

Fixes #19189
Ref #18384
Ref #17665
(cherry picked from commit 0f524fc947)

* Docker: Upgrade packages to resolve reported vulnerabilities (#19188)

Fixes #19186
(cherry picked from commit 4d96bc590f)

* FieldDisplay: Update title variable syntax (#19217)

(cherry picked from commit 14f1cf29f0)

* Cloudwatch: Fix autocomplete for Gamelift dimensions (#19145) (#19146)

(cherry picked from commit 79f8433675)

* grafana/ui: Add disabled prop on LinkButton (#19192)

(cherry picked from commit f445369d68)

* plugins: expose whole rxjs to plugins (#19226)

(cherry picked from commit 98c95a8a83)

* Snapshots: store DataFrameDTO instead of MutableDataFrame in snapshot data (#19247)

(cherry picked from commit be8097fca2)

* grafana/toolkit: Add plugin scaffolding (#19207)

(cherry picked from commit 54ebf174a0)

* Alerting: Truncate PagerDuty summary when greater than 1024 characters (#18730)

Requests to PagerDuty fail with an HTTP 400 if the `summary`
attribute contains more than 1024 characters; this change truncates the summary to that limit.
API spec:
https://v2.developer.pagerduty.com/docs/send-an-event-events-api-v2

Fixes #18727
(cherry picked from commit 8a991244d5)
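The truncation described above can be sketched as follows. This is a minimal illustration under the 1024-character limit documented in the Events API v2 spec linked above; `truncateSummary` is a hypothetical name, not necessarily the function used in Grafana:

```go
package main

import "fmt"

// maxSummaryLen is the limit beyond which PagerDuty's Events API v2
// rejects the event with an HTTP 400.
const maxSummaryLen = 1024

// truncateSummary shortens s to at most maxSummaryLen bytes.
// Note: byte slicing can split a multi-byte UTF-8 rune at the boundary;
// a production version might trim to a rune boundary instead.
func truncateSummary(s string) string {
	if len(s) > maxSummaryLen {
		return s[:maxSummaryLen]
	}
	return s
}

func main() {
	long := make([]byte, 2000)
	for i := range long {
		long[i] = 'a'
	}
	fmt.Println(len(truncateSummary(string(long))))
	fmt.Println(truncateSummary("disk full on host-01"))
}
```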

* grafana/toolkit: Fix toolkit not building @grafana/toolkit (#19253)

* Fix toolkit not building

Weird TS didn't pick this up...

* Update packages/grafana-toolkit/src/cli/index.ts

(cherry picked from commit 809e2ca3c7)

* Docs: Update theming docs (#19248)

(cherry picked from commit 9feac7753b)

* Explore: live tail UI fixes and improvements (#19187)

(cherry picked from commit bf24cbba76)

* Graphite: Changed range expansion from 1m to 1s (#19246)

Fixes #11472
(cherry picked from commit d95318b325)

* MySQL, Postgres, MSSQL: Only debug log when in development (#19239)

Found some additional debug statements in relation to #19049 that
can cause memory issues.

Ref #19049
(cherry picked from commit 19f3ec4891)

* Vector: remove toJSON() from interface (#19254)

(cherry picked from commit 6787e7b5ab)

* Update changelog task to generate toolkit changelog too (#19262)

(cherry picked from commit b7752b8c02)

* Dashboard: Hides alpha icon for visualization that is not in alpha/beta stage #19300

Fixes #19251
(cherry picked from commit f01836c17a)

* Build: Split up task in the CI pipeline to ease running outside circleci (#18861)

* build: make sign rpm packages not depend on checking out private key

* build: move commands from circleci config into verify signed packages script

* build: split update and publish of deb and rpm into two scripts

* use files argument for sign and verify packages

* validate files argument for sign and verify packages

* update test publish of deb/rpm readme

(cherry picked from commit 4386604751)

* Admin/user: fix textarea postion in 'Pending Invites' to avoid page scrolling (#19288)

* hide textarea element after click 'Copy Invite' button on firefox
(cherry picked from commit 50b4695cf5)

* Alerting: Prevents creating alerts from unsupported queries (#19250)

* Refactor: Makes PanelEditor use state and shows validation message on AlertTab

* Refactor: Makes validation message nicer looking

* Refactor: Changes imports

* Refactor: Removes conditional props

* Refactor: Changes after feedback from PR review

* Refactor: Removes unused action

(cherry picked from commit 9bd6ed887c)

* Chore: Update Slate to 0.47.8 (#19197)

* Chore: Update Slate to 0.47.8
Closes #17430

(cherry picked from commit 68d6da77da)

* DataLinks: Small UX improvements to DataLinksInput (#19313)

Closes #19257
(cherry picked from commit feb6bc6747)

* Multi-LDAP: Do not fail-fast on invalid credentials (#19261)

* Multi-LDAP: Do not fail-fast on invalid credentials

When configuring LDAP authentication, it is very common to have multiple
servers configured. When using user bind (authenticating with LDAP using
the same credentials as the user authenticating to Grafana) we don't
expect all the users to be on all LDAP servers.

Because of this use-case, we should not fail-fast when authenticating on
multiple LDAP server configurations. Instead, we should continue to try
the credentials with the next LDAP server configured.

Fixes #19066
(cherry picked from commit 279249ef56)
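The "continue to the next server" behavior described above can be sketched like this. It is a simplified illustration, not Grafana's actual multi-LDAP code; `tryServers` and the callback shape are hypothetical:

```go
package main

import (
	"errors"
	"fmt"
)

var errInvalidCredentials = errors.New("invalid credentials")

// tryServers authenticates against each configured LDAP server in order.
// Invalid credentials on one server no longer abort the whole attempt
// (the user may simply not exist there); any other error still does.
func tryServers(servers []func(user, pass string) error, user, pass string) error {
	for _, bind := range servers {
		switch err := bind(user, pass); {
		case err == nil:
			return nil // authenticated on this server
		case errors.Is(err, errInvalidCredentials):
			continue // try the next server instead of failing fast
		default:
			return err // e.g. network failure: abort
		}
	}
	return errInvalidCredentials
}

func main() {
	reject := func(u, p string) error { return errInvalidCredentials }
	accept := func(u, p string) error { return nil }
	fmt.Println(tryServers([]func(string, string) error{reject, accept}, "bob", "pw"))
}
```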

* Explore: Fix unsubscribing from Loki websocket (#19263)

(cherry picked from commit 4c1bc59889)

* Plugins: Skips existence of module.js for renderer plugins (#19318)

* Fix: Skips test for module.js for plugins of renderer type
Fixes #19130

* Refactor: Changes after PR comments

* Chore: Fixes go lint issue

(cherry picked from commit 75dcaecc99)

* Keybindings: Improve esc / exit / blur logic (#19320)

* Keybindings: Improve esc / exit / blur logic

* Slight modifications

* removed use of jquery

(cherry picked from commit 08cc4f0c8a)

* Select: Set placeholder color (#19309)

(cherry picked from commit 2c9577fcc5)

* Azure Monitor: Revert "support for cross resource queries (#19115)" (#19346)

This reverts commit 88051258e9.
(cherry picked from commit 4dbedb8405)

* Dashboard: Fix export for sharing when panels use default data source (#19315)

* PanelModel: moved datasource: null away from defaults that are removed

* Added unit test

(cherry picked from commit ac3fb6452d)

* Heatmap: use DataFrame rather than LegacyResponseData (#19026)

* merge master

* TimeSeries: datasources with labels should export tags (not labels) (#18977)

* merge master

* export prometheus tags

* Annotations: Add annotations support to Loki (#18949)

* Explore: Unify background color for fresh logs (#18973)

* Singlestat: render lines on the panel when sparklines are enabled (#18984)

* Image rendering: Add deprecation warning when PhantomJS is used for rendering images (#18933)

* Add deprecation warning

* Update pkg/services/rendering/rendering.go

Co-Authored-By: Marcus Efraimsson <marcus.efraimsson@gmail.com>

* Units: Adding T,P,E,Z,and Y bytes (#18706)

* Adding T and P for bytes

Luckily, all the hard work was done before; just added in these prefixes for our production environment.

* Future-proofing with other values (why not?)

* Yottaflops?

* Cutting back down to Peta sizes, except for hashes

* Refactor: move ScopedVars to grafana/data (#18992)

* Refactor: Move sql_engine to sub package of tsdb (#18991)

this way importing the tsdb package does not come with xorm dependencies

* use DataFrame in heatmaps

* actually use the setting :)

* remove unused timeSrv

* merge with master / useDataFrames

* fix test function

* merge master

* fix datasource type on snapshot

* reuse DataFrame calcs from graph panel

* update comments

(cherry picked from commit 2474511d03)

* Explore: Do not send explicit maxDataPoints for logs. (#19235)

(cherry picked from commit f203e82b40)

* MySQL, Postgres, MSSQL: Fix validating query with template variables in alert  (#19237)

Adds support for validating query in alert for mysql,
postgres and mssql.

Fixes #13155
(cherry picked from commit 96046a7ba6)

* MySQL, Postgres: Update raw sql when query builder updates (#19209)

Raw sql now updates when changing query using
graphical query editor for mysql and postgres.

Fixes #19063
(cherry picked from commit 7c499ffdd8)

* MySQL: Limit datasource error details returned from the backend (#19373)

Only return certain MySQL errors from the backend.
The following errors are returned as-is from the backend:
error code 1064 (parse error)
error code 1054 (bad column/field selected)
error code 1146 (table does not exist)
Any other error is logged and returned as a generic error.
Restrict use of certain functions:
do not allow usage of the following in a query:
system_user()
session_user()
current_user() or current_user
user()
show grants

Fixes #19360
(cherry picked from commit 3de693af49)

* SQL: Rewrite statistics query (#19178)

* Rewrite statistics query
(cherry picked from commit 56f5106717)

* Release v6.4.0-beta2

* ValueFormats: check for inf (#19376)


(cherry picked from commit 32b73bb496)

* Build: Fix correct sort order of merged pr's in cherrypick task (#19379)


(cherry picked from commit c4a03f482c)
2019-09-25 09:49:55 +02:00
Dominik Prokop
10d47ab095 release 6.4.0-beta.1 2019-09-17 13:29:14 +02:00
260 changed files with 6497 additions and 4187 deletions

View File

@@ -19,7 +19,7 @@ version: 2
 jobs:
   mysql-integration-test:
     docker:
-      - image: circleci/golang:1.12.9
+      - image: circleci/golang:1.12.10
       - image: circleci/mysql:5.6-ram
     environment:
       MYSQL_ROOT_PASSWORD: rootpass
@@ -39,7 +39,7 @@ jobs:
   postgres-integration-test:
     docker:
-      - image: circleci/golang:1.12.9
+      - image: circleci/golang:1.12.10
       - image: circleci/postgres:9.3-ram
     environment:
       POSTGRES_USER: grafanatest
@@ -58,7 +58,7 @@ jobs:
   cache-server-test:
     docker:
-      - image: circleci/golang:1.12.9
+      - image: circleci/golang:1.12.10
       - image: circleci/redis:4-alpine
       - image: memcached
     working_directory: /go/src/github.com/grafana/grafana
@@ -144,7 +144,7 @@ jobs:
   lint-go:
     docker:
-      - image: circleci/golang:1.12.9
+      - image: circleci/golang:1.12.10
     environment:
       # we need CGO because of go-sqlite3
       CGO_ENABLED: 1
@@ -185,7 +185,7 @@ jobs:
   test-backend:
     docker:
-      - image: circleci/golang:1.12.9
+      - image: circleci/golang:1.12.10
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -195,7 +195,7 @@ jobs:
   build-all:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -214,15 +214,15 @@ jobs:
       - run:
           name: build and package grafana
           command: './scripts/build/build-all.sh'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: sign packages
-          command: './scripts/build/sign_packages.sh'
+          command: './scripts/build/sign_packages.sh dist/*.rpm'
       - run:
           name: verify signed packages
-          command: |
-            mkdir -p ~/.rpmdb/pubkeys
-            curl -s https://packages.grafana.com/gpg.key > ~/.rpmdb/pubkeys/grafana.key
-            ./scripts/build/verify_signed_packages.sh dist/*.rpm
+          command: './scripts/build/verify_signed_packages.sh dist/*.rpm'
       - run:
           name: sha-sum packages
           command: 'go run build.go sha-dist'
@@ -239,7 +239,7 @@ jobs:
   build:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -249,9 +249,12 @@ jobs:
       - run:
           name: build and package grafana
           command: './scripts/build/build.sh'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: sign packages
-          command: './scripts/build/sign_packages.sh'
+          command: './scripts/build/sign_packages.sh dist/*.rpm'
       - run:
           name: sha-sum packages
           command: 'go run build.go sha-dist'
@@ -265,7 +268,7 @@ jobs:
   build-fast-backend:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -282,7 +285,7 @@ jobs:
   build-fast-frontend:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -306,7 +309,7 @@ jobs:
   build-fast-package:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -333,7 +336,7 @@ jobs:
   build-fast-save:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -360,9 +363,12 @@ jobs:
       - run:
           name: package grafana
           command: './scripts/build/build.sh --fast --package-only'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: sign packages
-          command: './scripts/build/sign_packages.sh'
+          command: './scripts/build/sign_packages.sh dist/*.rpm'
       - run:
           name: sha-sum packages
           command: 'go run build.go sha-dist'
@@ -419,7 +425,7 @@ jobs:
   build-enterprise:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -435,9 +441,12 @@ jobs:
      - run:
           name: build and package enterprise
           command: './scripts/build/build.sh -enterprise'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: sign packages
-          command: './scripts/build/sign_packages.sh'
+          command: './scripts/build/sign_packages.sh dist/*.rpm'
       - run:
           name: sha-sum packages
           command: 'go run build.go sha-dist'
@@ -451,7 +460,7 @@ jobs:
   build-all-enterprise:
     docker:
-      - image: grafana/build-container:1.2.8
+      - image: grafana/build-container:1.2.9
     working_directory: /go/src/github.com/grafana/grafana
     steps:
       - checkout
@@ -476,15 +485,15 @@ jobs:
       - run:
           name: build and package grafana
           command: './scripts/build/build-all.sh -enterprise'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: sign packages
-          command: './scripts/build/sign_packages.sh'
+          command: './scripts/build/sign_packages.sh dist/*.rpm'
       - run:
           name: verify signed packages
-          command: |
-            mkdir -p ~/.rpmdb/pubkeys
-            curl -s https://packages.grafana.com/gpg.key > ~/.rpmdb/pubkeys/grafana.key
-            ./scripts/build/verify_signed_packages.sh dist/*.rpm
+          command: './scripts/build/verify_signed_packages.sh dist/*.rpm'
       - run:
           name: sha-sum packages
           command: 'go run build.go sha-dist'
@@ -537,15 +546,24 @@ jobs:
       - run:
           name: Deploy to Grafana.com
           command: './scripts/build/publish.sh --enterprise'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: Load GPG private key
-          command: './scripts/build/load-signing-key.sh'
+          command: './scripts/build/update_repo/load-signing-key.sh'
       - run:
           name: Update Debian repository
           command: './scripts/build/update_repo/update-deb.sh "enterprise" "$GPG_KEY_PASSWORD" "$CIRCLE_TAG" "enterprise-dist"'
+      - run:
+          name: Publish Debian repository
+          command: './scripts/build/update_repo/publish-deb.sh "enterprise"'
       - run:
           name: Update RPM repository
           command: './scripts/build/update_repo/update-rpm.sh "enterprise" "$GPG_KEY_PASSWORD" "$CIRCLE_TAG" "enterprise-dist"'
+      - run:
+          name: Publish RPM repository
+          command: './scripts/build/update_repo/publish-rpm.sh "enterprise" "$CIRCLE_TAG"'
   deploy-master:
@@ -591,15 +609,24 @@ jobs:
       - run:
           name: Deploy to Grafana.com
           command: './scripts/build/publish.sh'
+      - run:
+          name: Prepare GPG private key
+          command: './scripts/build/prepare_signing_key.sh'
       - run:
           name: Load GPG private key
-          command: './scripts/build/load-signing-key.sh'
+          command: './scripts/build/update_repo/load-signing-key.sh'
       - run:
           name: Update Debian repository
           command: './scripts/build/update_repo/update-deb.sh "oss" "$GPG_KEY_PASSWORD" "$CIRCLE_TAG" "dist"'
+      - run:
+          name: Publish Debian repository
+          command: './scripts/build/update_repo/publish-deb.sh "oss"'
       - run:
           name: Update RPM repository
           command: './scripts/build/update_repo/update-rpm.sh "oss" "$GPG_KEY_PASSWORD" "$CIRCLE_TAG" "dist"'
+      - run:
+          name: Publish RPM repository
+          command: './scripts/build/update_repo/publish-rpm.sh "oss" "$CIRCLE_TAG"'
   build-oss-msi:
     docker:

View File

@@ -1,5 +1,5 @@
 # Golang build container
-FROM golang:1.12.9-alpine
+FROM golang:1.12.10-alpine

 RUN apk add --no-cache gcc g++
@@ -62,7 +62,8 @@ ENV PATH=/usr/share/grafana/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bi
 WORKDIR $GF_PATHS_HOME

-RUN apk add --no-cache ca-certificates bash
+RUN apk add --no-cache ca-certificates bash && \
+    apk add --no-cache --upgrade --repository=http://dl-cdn.alpinelinux.org/alpine/edge/main openssl musl-utils

 COPY conf ./conf

File diff suppressed because it is too large.

View File

@@ -38,10 +38,7 @@ Just add it as a datasource and you are ready to query your log data in [Explore
 ## Querying Logs

-Querying and displaying log data from Loki is available via [Explore](/features/explore).
-Select the Loki data source, and then enter a log query to display your logs.
-
-> Viewing Loki data in dashboard panels is not supported yet, but is being worked on.
+Querying and displaying log data from Loki is available via [Explore](/features/explore), and with the [logs panel](/features/panels/logs/) in dashboards.
+Select the Loki data source, and then enter a log query to display your logs.

 ### Log Queries

View File

@@ -0,0 +1,39 @@
+++
title = "Logs Panel"
keywords = ["grafana", "dashboard", "documentation", "panels", "logs panel"]
type = "docs"
aliases = ["/reference/logs/"]
[menu.docs]
name = "Logs"
parent = "panels"
weight = 2
+++

# Logs Panel

<img class="screenshot" src="/assets/img/features/logs-panel.png">

> Logs panel is only available in Grafana v6.4+

The logs panel shows log lines from datasources that support logs, e.g., Elastic, Influx, and Loki.

Typically you would use this panel next to a graph panel to display the log output of a related process.

## Querying Data

The logs panel will show the result of queries that are specified in the **Queries** tab.
The results of multiple queries will be merged and sorted by time.
Note that you can scroll inside the panel in case the datasource returns more lines than can be displayed at any one time.

### Query Options

To limit the number of lines rendered, you can use the queries-wide **Max data points** setting. If it is not set, the datasource will usually enforce a limit.

## Visualization Options

### Columns

1. **Time**: Show/hide the time column. This is the timestamp associated with the log line as reported from the datasource.
2. **Order**: Set to **Ascending** to show the oldest log lines first.

<div class="clearfix"></div>

View File

@@ -2,5 +2,5 @@
   "npmClient": "yarn",
   "useWorkspaces": true,
   "packages": ["packages/*"],
-  "version": "6.4.0-pre"
+  "version": "6.4.2"
 }

View File

@@ -3,7 +3,7 @@
   "license": "Apache-2.0",
   "private": true,
   "name": "grafana",
-  "version": "6.4.0-pre",
+  "version": "6.4.2",
   "repository": {
     "type": "git",
     "url": "http://github.com/grafana/grafana.git"
@@ -52,7 +52,9 @@
     "@types/redux-logger": "3.0.7",
     "@types/redux-mock-store": "1.0.1",
     "@types/reselect": "2.2.0",
-    "@types/slate": "0.44.11",
+    "@types/slate": "0.47.1",
+    "@types/slate-plain-serializer": "0.6.1",
+    "@types/slate-react": "0.22.5",
     "@types/tinycolor2": "1.4.2",
     "angular-mocks": "1.6.6",
     "autoprefixer": "9.5.0",
@@ -121,6 +123,7 @@
     "redux-mock-store": "1.5.3",
     "regexp-replace-loader": "1.0.1",
     "rimraf": "2.6.3",
+    "rxjs-spy": "^7.5.1",
     "sass-lint": "1.12.1",
     "sass-loader": "7.1.0",
     "sinon": "1.17.6",
@@ -193,6 +196,7 @@
   },
   "dependencies": {
     "@babel/polyfill": "7.2.5",
+    "@grafana/slate-react": "0.22.9-grafana",
     "@torkelo/react-select": "2.4.1",
     "angular": "1.6.6",
     "angular-bindonce": "0.3.1",
@@ -243,10 +247,8 @@
     "rst2html": "github:thoward/rst2html#990cb89",
     "rxjs": "6.4.0",
     "search-query-parser": "1.5.2",
-    "slate": "0.33.8",
-    "slate-plain-serializer": "0.5.41",
-    "slate-prism": "0.5.0",
-    "slate-react": "0.12.11",
+    "slate": "0.47.8",
+    "slate-plain-serializer": "0.7.10",
     "tether": "1.4.5",
     "tether-drop": "https://github.com/torkelo/drop/tarball/master",
     "tinycolor2": "1.4.1",

View File

@@ -2,7 +2,7 @@
   "author": "Grafana Labs",
   "license": "Apache-2.0",
   "name": "@grafana/data",
-  "version": "6.4.0-pre",
+  "version": "6.4.2",
   "description": "Grafana Data Library",
   "keywords": [
     "typescript"


@@ -82,7 +82,7 @@ describe('FieldCache', () => {
it('should get the first field with a duplicate name', () => { it('should get the first field with a duplicate name', () => {
const field = ext.getFieldByName('value'); const field = ext.getFieldByName('value');
expect(field!.name).toEqual('value'); expect(field!.name).toEqual('value');
expect(field!.values.toJSON()).toEqual([1, 2, 3]); expect(field!.values.toArray()).toEqual([1, 2, 3]);
}); });
it('should return index of the field', () => { it('should return index of the field', () => {


@@ -1,13 +1,13 @@
import { import {
isDataFrame,
toLegacyResponseData,
isTableData,
toDataFrame,
guessFieldTypes,
guessFieldTypeFromValue, guessFieldTypeFromValue,
guessFieldTypes,
isDataFrame,
isTableData,
sortDataFrame, sortDataFrame,
toDataFrame,
toLegacyResponseData,
} from './processDataFrame'; } from './processDataFrame';
import { FieldType, TimeSeries, TableData, DataFrameDTO } from '../types/index'; import { DataFrameDTO, FieldType, TableData, TimeSeries } from '../types/index';
import { dateTime } from '../datetime/moment_wrapper'; import { dateTime } from '../datetime/moment_wrapper';
import { MutableDataFrame } from './MutableDataFrame'; import { MutableDataFrame } from './MutableDataFrame';
@@ -103,6 +103,34 @@ describe('toDataFrame', () => {
expect(norm.fields[2].type).toBe(FieldType.other); expect(norm.fields[2].type).toBe(FieldType.other);
expect(norm.fields[3].type).toBe(FieldType.time); // based on name expect(norm.fields[3].type).toBe(FieldType.time); // based on name
}); });
it('converts JSON document data to series', () => {
const input1 = {
datapoints: [
{
_id: 'W5rvjW0BKe0cA-E1aHvr',
_type: '_doc',
_index: 'logs-2019.10.02',
'@message': 'Deployed website',
'@timestamp': [1570044340458],
tags: ['deploy', 'website-01'],
description: 'Torkel deployed website',
coordinates: { latitude: 12, longitude: 121, level: { depth: 3, coolnes: 'very' } },
'unescaped-content': 'breaking <br /> the <br /> row',
},
],
filterable: true,
target: 'docs',
total: 206,
type: 'docs',
};
const dataFrame = toDataFrame(input1);
expect(dataFrame.fields[0].name).toBe(input1.target);
const v0 = dataFrame.fields[0].values;
expect(v0.length).toEqual(1);
expect(v0.get(0)).toEqual(input1.datapoints[0]);
});
}); });
describe('SerisData backwards compatibility', () => { describe('SerisData backwards compatibility', () => {
@@ -124,11 +152,13 @@ describe('SerisData backwards compatibility', () => {
const table = { const table = {
columns: [], columns: [],
rows: [], rows: [],
type: 'table',
}; };
const series = toDataFrame(table); const series = toDataFrame(table);
const roundtrip = toLegacyResponseData(series) as TableData; const roundtrip = toLegacyResponseData(series) as TableData;
expect(roundtrip.columns.length).toBe(0); expect(roundtrip.columns.length).toBe(0);
expect(roundtrip.type).toBe('table');
}); });
it('converts TableData to series and back again', () => { it('converts TableData to series and back again', () => {
@@ -176,6 +206,37 @@ describe('SerisData backwards compatibility', () => {
const names = table.columns.map(c => c.text); const names = table.columns.map(c => c.text);
expect(names).toEqual(['T', 'N', 'S']); expect(names).toEqual(['T', 'N', 'S']);
}); });
it('can convert TimeSeries to JSON document and back again', () => {
const timeseries = {
datapoints: [
{
_id: 'W5rvjW0BKe0cA-E1aHvr',
_type: '_doc',
_index: 'logs-2019.10.02',
'@message': 'Deployed website',
'@timestamp': [1570044340458],
tags: ['deploy', 'website-01'],
description: 'Torkel deployed website',
coordinates: { latitude: 12, longitude: 121, level: { depth: 3, coolnes: 'very' } },
'unescaped-content': 'breaking <br /> the <br /> row',
},
],
filterable: true,
target: 'docs',
total: 206,
type: 'docs',
};
const series = toDataFrame(timeseries);
expect(isDataFrame(timeseries)).toBeFalsy();
expect(isDataFrame(series)).toBeTruthy();
const roundtrip = toLegacyResponseData(series) as any;
expect(isDataFrame(roundtrip)).toBeFalsy();
expect(roundtrip.type).toBe('docs');
expect(roundtrip.target).toBe('docs');
expect(roundtrip.filterable).toBeTruthy();
});
}); });
describe('sorted DataFrame', () => { describe('sorted DataFrame', () => {
@@ -189,14 +250,14 @@ describe('sorted DataFrame', () => {
it('Should sort numbers', () => { it('Should sort numbers', () => {
const sorted = sortDataFrame(frame, 0, true); const sorted = sortDataFrame(frame, 0, true);
expect(sorted.length).toEqual(3); expect(sorted.length).toEqual(3);
expect(sorted.fields[0].values.toJSON()).toEqual([3, 2, 1]); expect(sorted.fields[0].values.toArray()).toEqual([3, 2, 1]);
expect(sorted.fields[1].values.toJSON()).toEqual(['c', 'b', 'a']); expect(sorted.fields[1].values.toArray()).toEqual(['c', 'b', 'a']);
}); });
it('Should sort strings', () => { it('Should sort strings', () => {
const sorted = sortDataFrame(frame, 1, true); const sorted = sortDataFrame(frame, 1, true);
expect(sorted.length).toEqual(3); expect(sorted.length).toEqual(3);
expect(sorted.fields[0].values.toJSON()).toEqual([3, 2, 1]); expect(sorted.fields[0].values.toArray()).toEqual([3, 2, 1]);
expect(sorted.fields[1].values.toJSON()).toEqual(['c', 'b', 'a']); expect(sorted.fields[1].values.toArray()).toEqual(['c', 'b', 'a']);
}); });
}); });


@@ -127,6 +127,33 @@ function convertGraphSeriesToDataFrame(graphSeries: GraphSeriesXY): DataFrame {
}; };
} }
function convertJSONDocumentDataToDataFrame(timeSeries: TimeSeries): DataFrame {
const fields = [
{
name: timeSeries.target,
type: FieldType.other,
config: {
unit: timeSeries.unit,
filterable: (timeSeries as any).filterable,
},
values: new ArrayVector(),
},
];
for (const point of timeSeries.datapoints) {
fields[0].values.buffer.push(point);
}
return {
name: timeSeries.target,
labels: timeSeries.tags,
refId: timeSeries.target,
meta: { json: true },
fields,
length: timeSeries.datapoints.length,
};
}
// PapaParse Dynamic Typing regex: // PapaParse Dynamic Typing regex:
// https://github.com/mholt/PapaParse/blob/master/papaparse.js#L998 // https://github.com/mholt/PapaParse/blob/master/papaparse.js#L998
const NUMBER = /^\s*-?(\d*\.?\d+|\d+\.?\d*)(e[-+]?\d+)?\s*$/i; const NUMBER = /^\s*-?(\d*\.?\d+|\d+\.?\d*)(e[-+]?\d+)?\s*$/i;
@@ -241,6 +268,11 @@ export const toDataFrame = (data: any): DataFrame => {
return new MutableDataFrame(data as DataFrameDTO); return new MutableDataFrame(data as DataFrameDTO);
} }
// Handle legacy docs/json type
if (data.hasOwnProperty('type') && data.type === 'docs') {
return convertJSONDocumentDataToDataFrame(data);
}
if (data.hasOwnProperty('datapoints')) { if (data.hasOwnProperty('datapoints')) {
return convertTimeSeriesToDataFrame(data); return convertTimeSeriesToDataFrame(data);
} }
@@ -288,6 +320,16 @@ export const toLegacyResponseData = (frame: DataFrame): TimeSeries | TableData =
} }
} }
if (frame.meta && frame.meta.json) {
return {
alias: fields[0].name || frame.name,
target: fields[0].name || frame.name,
datapoints: fields[0].values.toArray(),
filterable: fields[0].config ? fields[0].config.filterable : undefined,
type: 'docs',
} as TimeSeries;
}
return { return {
columns: fields.map(f => { columns: fields.map(f => {
const { name, config } = f; const { name, config } = f;
@@ -299,6 +341,7 @@ export const toLegacyResponseData = (frame: DataFrame): TimeSeries | TableData =
} }
return { text: name }; return { text: name };
}), }),
type: 'table',
refId: frame.refId, refId: frame.refId,
meta: frame.meta, meta: frame.meta,
rows, rows,
@@ -401,7 +444,7 @@ export function toDataFrameDTO(data: DataFrame): DataFrameDTO {
name: f.name, name: f.name,
type: f.type, type: f.type,
config: f.config, config: f.config,
values: f.values.toJSON(), values: f.values.toArray(),
}; };
}); });
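The hunks above add a legacy JSON `docs` round-trip: `toDataFrame` routes responses with `type: 'docs'` through a new converter, and `toLegacyResponseData` uses the `meta.json` flag to restore the original shape. A minimal self-contained sketch of that conversion follows; the type and function names (`DocsSeries`, `docsToFrame`, `frameToLegacyDocs`) are illustrative, not Grafana's actual exports:

```typescript
// Legacy Elasticsearch "Raw Document" responses carry whole JSON documents
// as datapoints; each document becomes one value in a single field.
interface DocsSeries {
  type: 'docs';
  target: string;
  datapoints: Array<Record<string, unknown>>;
  filterable?: boolean;
}

interface SimpleFrame {
  name: string;
  refId: string;
  meta: { json: boolean };
  fields: Array<{ name: string; values: unknown[]; config: { filterable?: boolean } }>;
  length: number;
}

function docsToFrame(series: DocsSeries): SimpleFrame {
  return {
    name: series.target,
    refId: series.target,
    meta: { json: true }, // marks the frame for the reverse conversion
    fields: [
      {
        name: series.target,
        values: [...series.datapoints],
        config: { filterable: series.filterable },
      },
    ],
    length: series.datapoints.length,
  };
}

function frameToLegacyDocs(frame: SimpleFrame): DocsSeries {
  const field = frame.fields[0];
  return {
    type: 'docs',
    target: field.name || frame.name,
    datapoints: field.values as Array<Record<string, unknown>>,
    filterable: field.config.filterable,
  };
}
```

The `meta.json` marker is the key design choice: without it, `toLegacyResponseData` would fall through to the generic table conversion and the table panel's JSON transformation would receive the wrong shape (the bug fixed in #19596).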


@@ -2,4 +2,5 @@
import * as dateMath from './datemath'; import * as dateMath from './datemath';
import * as rangeUtil from './rangeutil'; import * as rangeUtil from './rangeutil';
export * from './moment_wrapper'; export * from './moment_wrapper';
export * from './timezones';
export { dateMath, rangeUtil }; export { dateMath, rangeUtil };


@@ -0,0 +1,390 @@
// List taken from https://stackoverflow.com/questions/38399465/how-to-get-list-of-all-timezones-in-javascript
export const getTimeZoneGroups = () => {
const europeZones = [
'Europe/Amsterdam',
'Europe/Andorra',
'Europe/Astrakhan',
'Europe/Athens',
'Europe/Belgrade',
'Europe/Berlin',
'Europe/Brussels',
'Europe/Bucharest',
'Europe/Budapest',
'Europe/Chisinau',
'Europe/Copenhagen',
'Europe/Dublin',
'Europe/Gibraltar',
'Europe/Helsinki',
'Europe/Istanbul',
'Europe/Kaliningrad',
'Europe/Kiev',
'Europe/Kirov',
'Europe/Lisbon',
'Europe/London',
'Europe/Luxembourg',
'Europe/Madrid',
'Europe/Malta',
'Europe/Minsk',
'Europe/Monaco',
'Europe/Moscow',
'Europe/Oslo',
'Europe/Paris',
'Europe/Prague',
'Europe/Riga',
'Europe/Rome',
'Europe/Samara',
'Europe/Saratov',
'Europe/Simferopol',
'Europe/Sofia',
'Europe/Stockholm',
'Europe/Tallinn',
'Europe/Tirane',
'Europe/Ulyanovsk',
'Europe/Uzhgorod',
'Europe/Vienna',
'Europe/Vilnius',
'Europe/Volgograd',
'Europe/Warsaw',
'Europe/Zaporozhye',
'Europe/Zurich',
];
const africaZones = [
'Africa/Abidjan',
'Africa/Accra',
'Africa/Algiers',
'Africa/Bissau',
'Africa/Cairo',
'Africa/Casablanca',
'Africa/Ceuta',
'Africa/El_Aaiun',
'Africa/Johannesburg',
'Africa/Juba',
'Africa/Khartoum',
'Africa/Lagos',
'Africa/Maputo',
'Africa/Monrovia',
'Africa/Nairobi',
'Africa/Ndjamena',
'Africa/Sao_Tome',
'Africa/Tripoli',
'Africa/Tunis',
'Africa/Windhoek',
];
const asiaZones = [
'Asia/Almaty',
'Asia/Amman',
'Asia/Anadyr',
'Asia/Aqtau',
'Asia/Aqtobe',
'Asia/Ashgabat',
'Asia/Atyrau',
'Asia/Baghdad',
'Asia/Baku',
'Asia/Bangkok',
'Asia/Barnaul',
'Asia/Beirut',
'Asia/Bishkek',
'Asia/Brunei',
'Asia/Chita',
'Asia/Choibalsan',
'Asia/Colombo',
'Asia/Damascus',
'Asia/Dhaka',
'Asia/Dili',
'Asia/Dubai',
'Asia/Dushanbe',
'Asia/Famagusta',
'Asia/Gaza',
'Asia/Hebron',
'Asia/Ho_Chi_Minh',
'Asia/Hong_Kong',
'Asia/Hovd',
'Asia/Irkutsk',
'Asia/Jakarta',
'Asia/Jayapura',
'Asia/Jerusalem',
'Asia/Kabul',
'Asia/Kamchatka',
'Asia/Karachi',
'Asia/Kathmandu',
'Asia/Khandyga',
'Asia/Kolkata',
'Asia/Krasnoyarsk',
'Asia/Kuala_Lumpur',
'Asia/Kuching',
'Asia/Macau',
'Asia/Magadan',
'Asia/Makassar',
'Asia/Manila',
'Asia/Nicosia',
'Asia/Novokuznetsk',
'Asia/Novosibirsk',
'Asia/Omsk',
'Asia/Oral',
'Asia/Pontianak',
'Asia/Pyongyang',
'Asia/Qatar',
'Asia/Qostanay',
'Asia/Qyzylorda',
'Asia/Riyadh',
'Asia/Sakhalin',
'Asia/Samarkand',
'Asia/Seoul',
'Asia/Shanghai',
'Asia/Singapore',
'Asia/Srednekolymsk',
'Asia/Taipei',
'Asia/Tashkent',
'Asia/Tbilisi',
'Asia/Tehran',
'Asia/Thimphu',
'Asia/Tokyo',
'Asia/Tomsk',
'Asia/Ulaanbaatar',
'Asia/Urumqi',
'Asia/Ust-Nera',
'Asia/Vladivostok',
'Asia/Yakutsk',
'Asia/Yangon',
'Asia/Yekaterinburg',
'Asia/Yerevan',
];
const antarcticaZones = [
'Antarctica/Casey',
'Antarctica/Davis',
'Antarctica/DumontDUrville',
'Antarctica/Macquarie',
'Antarctica/Mawson',
'Antarctica/Palmer',
'Antarctica/Rothera',
'Antarctica/Syowa',
'Antarctica/Troll',
'Antarctica/Vostok',
];
const americaZones = [
'America/Adak',
'America/Anchorage',
'America/Araguaina',
'America/Argentina/Buenos_Aires',
'America/Argentina/Catamarca',
'America/Argentina/Cordoba',
'America/Argentina/Jujuy',
'America/Argentina/La_Rioja',
'America/Argentina/Mendoza',
'America/Argentina/Rio_Gallegos',
'America/Argentina/Salta',
'America/Argentina/San_Juan',
'America/Argentina/San_Luis',
'America/Argentina/Tucuman',
'America/Argentina/Ushuaia',
'America/Asuncion',
'America/Atikokan',
'America/Bahia',
'America/Bahia_Banderas',
'America/Barbados',
'America/Belem',
'America/Belize',
'America/Blanc-Sablon',
'America/Boa_Vista',
'America/Bogota',
'America/Boise',
'America/Cambridge_Bay',
'America/Campo_Grande',
'America/Cancun',
'America/Caracas',
'America/Cayenne',
'America/Chicago',
'America/Chihuahua',
'America/Costa_Rica',
'America/Creston',
'America/Cuiaba',
'America/Curacao',
'America/Danmarkshavn',
'America/Dawson',
'America/Dawson_Creek',
'America/Denver',
'America/Detroit',
'America/Edmonton',
'America/Eirunepe',
'America/El_Salvador',
'America/Fort_Nelson',
'America/Fortaleza',
'America/Glace_Bay',
'America/Godthab',
'America/Goose_Bay',
'America/Grand_Turk',
'America/Guatemala',
'America/Guayaquil',
'America/Guyana',
'America/Halifax',
'America/Havana',
'America/Hermosillo',
'America/Indiana/Indianapolis',
'America/Indiana/Knox',
'America/Indiana/Marengo',
'America/Indiana/Petersburg',
'America/Indiana/Tell_City',
'America/Indiana/Vevay',
'America/Indiana/Vincennes',
'America/Indiana/Winamac',
'America/Inuvik',
'America/Iqaluit',
'America/Jamaica',
'America/Juneau',
'America/Kentucky/Louisville',
'America/Kentucky/Monticello',
'America/La_Paz',
'America/Lima',
'America/Los_Angeles',
'America/Maceio',
'America/Managua',
'America/Manaus',
'America/Martinique',
'America/Matamoros',
'America/Mazatlan',
'America/Menominee',
'America/Merida',
'America/Metlakatla',
'America/Mexico_City',
'America/Miquelon',
'America/Moncton',
'America/Monterrey',
'America/Montevideo',
'America/Nassau',
'America/New_York',
'America/Nipigon',
'America/Nome',
'America/Noronha',
'America/North_Dakota/Beulah',
'America/North_Dakota/Center',
'America/North_Dakota/New_Salem',
'America/Ojinaga',
'America/Panama',
'America/Pangnirtung',
'America/Paramaribo',
'America/Phoenix',
'America/Port-au-Prince',
'America/Port_of_Spain',
'America/Porto_Velho',
'America/Puerto_Rico',
'America/Punta_Arenas',
'America/Rainy_River',
'America/Rankin_Inlet',
'America/Recife',
'America/Regina',
'America/Resolute',
'America/Rio_Branco',
'America/Santarem',
'America/Santiago',
'America/Santo_Domingo',
'America/Sao_Paulo',
'America/Scoresbysund',
'America/Sitka',
'America/St_Johns',
'America/Swift_Current',
'America/Tegucigalpa',
'America/Thule',
'America/Thunder_Bay',
'America/Tijuana',
'America/Toronto',
'America/Vancouver',
'America/Whitehorse',
'America/Winnipeg',
'America/Yakutat',
'America/Yellowknife',
];
const pacificZones = [
'Pacific/Apia',
'Pacific/Auckland',
'Pacific/Bougainville',
'Pacific/Chatham',
'Pacific/Chuuk',
'Pacific/Easter',
'Pacific/Efate',
'Pacific/Enderbury',
'Pacific/Fakaofo',
'Pacific/Fiji',
'Pacific/Funafuti',
'Pacific/Galapagos',
'Pacific/Gambier',
'Pacific/Guadalcanal',
'Pacific/Guam',
'Pacific/Honolulu',
'Pacific/Kiritimati',
'Pacific/Kosrae',
'Pacific/Kwajalein',
'Pacific/Majuro',
'Pacific/Marquesas',
'Pacific/Nauru',
'Pacific/Niue',
'Pacific/Norfolk',
'Pacific/Noumea',
'Pacific/Pago_Pago',
'Pacific/Palau',
'Pacific/Pitcairn',
'Pacific/Pohnpei',
'Pacific/Port_Moresby',
'Pacific/Rarotonga',
'Pacific/Tahiti',
'Pacific/Tarawa',
'Pacific/Tongatapu',
'Pacific/Wake',
'Pacific/Wallis',
];
const australiaZones = [
'Australia/Adelaide',
'Australia/Brisbane',
'Australia/Broken_Hill',
'Australia/Currie',
'Australia/Darwin',
'Australia/Eucla',
'Australia/Hobart',
'Australia/Lindeman',
'Australia/Lord_Howe',
'Australia/Melbourne',
'Australia/Perth',
'Australia/Sydney',
];
const atlanticZones = [
'Atlantic/Azores',
'Atlantic/Bermuda',
'Atlantic/Canary',
'Atlantic/Cape_Verde',
'Atlantic/Faroe',
'Atlantic/Madeira',
'Atlantic/Reykjavik',
'Atlantic/South_Georgia',
'Atlantic/Stanley',
];
const indianZones = [
'Indian/Chagos',
'Indian/Christmas',
'Indian/Cocos',
'Indian/Kerguelen',
'Indian/Mahe',
'Indian/Maldives',
'Indian/Mauritius',
'Indian/Reunion',
];
return [
{ label: 'Africa', options: africaZones },
{ label: 'America', options: americaZones },
{ label: 'Antarctica', options: antarcticaZones },
{ label: 'Asia', options: asiaZones },
{ label: 'Atlantic', options: atlanticZones },
{ label: 'Australia', options: australiaZones },
{ label: 'Europe', options: europeZones },
{ label: 'Indian', options: indianZones },
{ label: 'Pacific', options: pacificZones },
];
};
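The new `getTimeZoneGroups()` helper returns `{ label, options }` groups ready to feed a grouped select control. A small usage sketch, using a truncated copy of the groups for illustration (the real helper returns all regions shown above):

```typescript
// Flatten grouped time zones into <group, zone> pairs, e.g. for a
// grouped dropdown. Demo data only; the real helper covers every region.
const getTimeZoneGroupsDemo = () => [
  { label: 'Europe', options: ['Europe/Stockholm', 'Europe/London'] },
  { label: 'Pacific', options: ['Pacific/Auckland'] },
];

const selectOptions = getTimeZoneGroupsDemo().flatMap(group =>
  group.options.map(zone => ({ group: group.label, value: zone }))
);
```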


@@ -47,6 +47,7 @@ export interface TableData extends QueryResultBase {
name?: string; name?: string;
columns: Column[]; columns: Column[];
rows: any[][]; rows: any[][];
type?: string;
} }
export type TimeSeriesValue = number | null; export type TimeSeriesValue = number | null;


@@ -12,3 +12,4 @@ export * from './displayValue';
export * from './graph'; export * from './graph';
export * from './ScopedVars'; export * from './ScopedVars';
export * from './transformations'; export * from './transformations';
export * from './vector';


@@ -41,3 +41,9 @@ export interface TimeOptions {
export type TimeFragment = string | DateTime; export type TimeFragment = string | DateTime;
export const TIME_FORMAT = 'YYYY-MM-DD HH:mm:ss'; export const TIME_FORMAT = 'YYYY-MM-DD HH:mm:ss';
export const DefaultTimeRange: TimeRange = {
from: {} as DateTime,
to: {} as DateTime,
raw: { from: '6h', to: 'now' },
};


@@ -10,11 +10,6 @@ export interface Vector<T = any> {
* Get the results as an array. * Get the results as an array.
*/ */
toArray(): T[]; toArray(): T[];
/**
* Return the values as a simple array for json serialization
*/
toJSON(): any; // same results as toArray()
} }
/** /**
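This hunk drops `toJSON()` from the `Vector` interface, since it returned the same result as `toArray()` (the test and `toDataFrameDTO` hunks above are updated accordingly). A minimal sketch of the slimmed-down contract; the names here are illustrative, not Grafana's actual `ArrayVector`:

```typescript
// After this diff, serialization goes through toArray(); there is no
// separate toJSON() on the vector contract.
interface VectorSketch<T = any> {
  length: number;
  get(index: number): T;
  toArray(): T[];
}

class ArrayVectorSketch<T> implements VectorSketch<T> {
  constructor(public buffer: T[] = []) {}
  get length(): number {
    return this.buffer.length;
  }
  get(index: number): T {
    return this.buffer[index];
  }
  toArray(): T[] {
    return [...this.buffer]; // copy, so callers cannot mutate the buffer
  }
}
```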


@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/runtime", "name": "@grafana/runtime",
"version": "6.4.0-pre", "version": "6.4.2",
"description": "Grafana Runtime Library", "description": "Grafana Runtime Library",
"keywords": [ "keywords": [
"grafana", "grafana",
@@ -21,8 +21,8 @@
"build": "grafana-toolkit package:build --scope=runtime" "build": "grafana-toolkit package:build --scope=runtime"
}, },
"dependencies": { "dependencies": {
"@grafana/data": "^6.4.0-alpha", "@grafana/data": "6.4.2",
"@grafana/ui": "^6.4.0-alpha", "@grafana/ui": "6.4.2",
"systemjs": "0.20.19", "systemjs": "0.20.19",
"systemjs-plugin-css": "0.1.37" "systemjs-plugin-css": "0.1.37"
}, },


@@ -0,0 +1,4 @@
# 6.4.0 (unreleased)
# 6.4.0-beta1 (2019-09-17)
First release, see [Readme](https://github.com/grafana/grafana/blob/v6.4.0-beta1/packages/grafana-toolkit/README.md) for details.


@@ -2,23 +2,27 @@
> **@grafana/toolkit is currently in ALPHA**. Core API is unstable and can be a subject of breaking changes! > **@grafana/toolkit is currently in ALPHA**. Core API is unstable and can be a subject of breaking changes!
# grafana-toolkit # grafana-toolkit
grafana-toolkit is a CLI that enables efficient development of Grafana extensions grafana-toolkit is a CLI that enables efficient development of Grafana plugins
## Rationale ## Rationale
Historically, creating a Grafana extension was an exercise of reverse engineering and ceremony around testing, developing and eventually building the plugin. We want to help our community to focus on the core value of their plugins rather than all the setup required to develop an extension. Historically, creating a Grafana plugin was an exercise of reverse engineering and ceremony around testing, developing and eventually building the plugin. We want to help our community to focus on the core value of their plugins rather than all the setup required to develop them.
## Installation ## Getting started
You can either add grafana-toolkit to your extension's `package.json` file by running Set up a new plugin with the `grafana-toolkit plugin:create` command:
`yarn add @grafana/toolkit` or `npm instal @grafana/toolkit`, or use one of our extension templates:
- [React Panel](https://github.com/grafana/simple-react-panel)
- [Angular Panel](https://github.com/grafana/simple-angular-panel)
### Updating your extension to use grafana-toolkit ```sh
In order to start using grafana-toolkit in your extension you need to follow the steps below: npx grafana-toolkit plugin:create my-grafana-plugin
1. Add `@grafana/toolkit` package to your project cd my-grafana-plugin
2. Create `tsconfig.json` file in the root dir of your extension and paste the code below: yarn install
yarn dev
```
### Updating your plugin to use grafana-toolkit
In order to start using grafana-toolkit in your existing plugin you need to follow the steps below:
1. Add `@grafana/toolkit` package to your project by running `yarn add @grafana/toolkit` or `npm install @grafana/toolkit`
2. Create `tsconfig.json` file in the root dir of your plugin and paste the code below:
```json ```json
{ {
"extends": "./node_modules/@grafana/toolkit/src/config/tsconfig.plugin.json", "extends": "./node_modules/@grafana/toolkit/src/config/tsconfig.plugin.json",
@@ -31,7 +35,7 @@ In order to start using grafana-toolkit in your extension you need to follow the
} }
``` ```
3. Create `.prettierrc.js` file in the root dir of your extension and paste the code below: 3. Create `.prettierrc.js` file in the root dir of your plugin and paste the code below:
```js ```js
module.exports = { module.exports = {
...require("./node_modules/@grafana/toolkit/src/config/prettier.plugin.config.json"), ...require("./node_modules/@grafana/toolkit/src/config/prettier.plugin.config.json"),
@@ -49,13 +53,21 @@ module.exports = {
``` ```
## Usage ## Usage
With grafana-toolkit we put in your hands a CLI that addresses common tasks performed when working on a Grafana extension: With grafana-toolkit we put in your hands a CLI that addresses common tasks performed when working on a Grafana plugin:
- `grafana-toolkit plugin:test` - `grafana-toolkit plugin:create`
- `grafana-toolkit plugin:dev` - `grafana-toolkit plugin:dev`
- `grafana-toolkit plugin:test`
- `grafana-toolkit plugin:build` - `grafana-toolkit plugin:build`
### Developing extensions ### Creating plugin
`grafana-toolkit plugin:create plugin-name`
Creates a new Grafana plugin from a template.
If `plugin-name` is provided, the template will be downloaded to the `./plugin-name` directory. Otherwise, it will be downloaded to the current directory.
### Developing plugin
`grafana-toolkit plugin:dev` `grafana-toolkit plugin:dev`
Creates a development build that's easy to play with and debug using common browser tooling Creates a development build that's easy to play with and debug using common browser tooling
@@ -63,7 +75,7 @@ Creates development build that's easy to play with and debug using common browse
Available options: Available options:
- `-w`, `--watch` - run development task in a watch mode - `-w`, `--watch` - run development task in a watch mode
### Testing extensions ### Testing plugin
`grafana-toolkit plugin:test` `grafana-toolkit plugin:test`
Runs Jest against your codebase Runs Jest against your codebase
@@ -76,26 +88,29 @@ Available options:
- `--testPathPattern=<regex>` - runs test with paths that match provided regex (https://jestjs.io/docs/en/cli#testpathpattern-regex) - `--testPathPattern=<regex>` - runs test with paths that match provided regex (https://jestjs.io/docs/en/cli#testpathpattern-regex)
### Building extensions ### Building plugin
`grafana-toolkit plugin:build` `grafana-toolkit plugin:build`
Creates a production-ready build of your extension Creates a production-ready build of your plugin
## FAQ ## FAQ
### Which version should I use?
Please refer to [Grafana packages versioning guide](https://github.com/grafana/grafana/blob/master/packages/README.md#versioning)
### What tools does grafana-toolkit use? ### What tools does grafana-toolkit use?
grafana-toolkit comes with Typescript, TSLint, Prettier, Jest, CSS and SASS support. grafana-toolkit comes with Typescript, TSLint, Prettier, Jest, CSS and SASS support.
### How to start using grafana-toolkit in my extension? ### How to start using grafana-toolkit in my plugin?
See [Updating your extension to use grafana-toolkit](#updating-your-extension-to-use-grafana-toolkit) See [Updating your plugin to use grafana-toolkit](#updating-your-plugin-to-use-grafana-toolkit)
### Can I use Typescript to develop Grafana extensions?
### Can I use Typescript to develop Grafana plugins?
Yes! grafana-toolkit supports Typescript by default. Yes! grafana-toolkit supports Typescript by default.
### How can I test my extension? ### How can I test my plugin?
grafana-toolkit comes with Jest as a test runner. grafana-toolkit comes with Jest as a test runner.
Internally at Grafana we use Enzyme. If you are developing a React extension and you want to configure Enzyme as a testing utility, you need to configure `enzyme-adapter-react`. To do so, create a `<YOUR_EXTENSION>/config/jest-setup.ts` file that will provide the necessary setup. Copy the following code into that file to get Enzyme working with React: Internally at Grafana we use Enzyme. If you are developing a React plugin and you want to configure Enzyme as a testing utility, you need to configure `enzyme-adapter-react`. To do so, create a `<YOUR_PLUGIN_DIR>/config/jest-setup.ts` file that will provide the necessary setup. Copy the following code into that file to get Enzyme working with React:
```ts ```ts
import { configure } from 'enzyme'; import { configure } from 'enzyme';
@@ -104,7 +119,7 @@ import Adapter from 'enzyme-adapter-react-16';
configure({ adapter: new Adapter() }); configure({ adapter: new Adapter() });
``` ```
You can also set up Jest with any shims you need by creating a `jest-shim.ts` file in the same directory: `<YOUR_EXTENSION>/config/jest-shim.ts` You can also set up Jest with any shims you need by creating a `jest-shim.ts` file in the same directory: `<YOUR_PLUGIN_DIR>/config/jest-shim.ts`
### Can I provide custom setup for Jest? ### Can I provide custom setup for Jest?
@@ -114,7 +129,7 @@ Currently we support following Jest config properties:
- [`snapshotSerializers`](https://jest-bot.github.io/jest/docs/configuration.html#snapshotserializers-array-string) - [`snapshotSerializers`](https://jest-bot.github.io/jest/docs/configuration.html#snapshotserializers-array-string)
- [`moduleNameMapper`](https://jestjs.io/docs/en/configuration#modulenamemapper-object-string-string) - [`moduleNameMapper`](https://jestjs.io/docs/en/configuration#modulenamemapper-object-string-string)
### How can I style my extension? ### How can I style my plugin?
We support pure CSS, SASS and CSS-in-JS approach (via [Emotion](https://emotion.sh/)). We support pure CSS, SASS and CSS-in-JS approach (via [Emotion](https://emotion.sh/)).
#### Single CSS or SASS file #### Single CSS or SASS file
@@ -132,18 +147,18 @@ The styles will be injected via `style` tag during runtime.
If you want to provide different stylesheets for dark/light theme, create `dark.[css|scss]` and `light.[css|scss]` files in `src/styles` directory of your plugin. grafana-toolkit will generate theme specific stylesheets that will end up in `dist/styles` directory. If you want to provide different stylesheets for dark/light theme, create `dark.[css|scss]` and `light.[css|scss]` files in `src/styles` directory of your plugin. grafana-toolkit will generate theme specific stylesheets that will end up in `dist/styles` directory.
In order for Grafana to pick up your theme stylesheets you need to use `loadPluginCss` from the `@grafana/runtime` package. Typically you would do that in the entrypoint of your extension: In order for Grafana to pick up your theme stylesheets you need to use `loadPluginCss` from the `@grafana/runtime` package. Typically you would do that in the entrypoint of your plugin:
```ts ```ts
import { loadPluginCss } from '@grafana/runtime'; import { loadPluginCss } from '@grafana/runtime';
loadPluginCss({ loadPluginCss({
dark: 'plugins/<YOUR-EXTENSION-NAME>/styles/dark.css', dark: 'plugins/<YOUR-PLUGIN-ID>/styles/dark.css',
light: 'plugins/<YOUR-EXTENSION-NAME>/styles/light.css', light: 'plugins/<YOUR-PLUGIN-ID>/styles/light.css',
}); });
``` ```
You need to add `@grafana/runtime` to your extension dependencies by running `yarn add @grafana/runtime` or `npm install @grafana/runtime` You need to add `@grafana/runtime` to your plugin dependencies by running `yarn add @grafana/runtime` or `npm install @grafana/runtime`
> Note that in this case static files (png, svg, json, html) are all copied to the dist directory when the plugin is bundled. Relative paths to those files do not change! > Note that in this case static files (png, svg, json, html) are all copied to the dist directory when the plugin is bundled. Relative paths to those files do not change!
@@ -194,7 +209,7 @@ grafana-toolkit comes with [default config for TSLint](https://github.com/grafan
### How is Prettier integrated into grafana-toolkit workflow? ### How is Prettier integrated into grafana-toolkit workflow?
When building an extension with the [`grafana-toolkit plugin:build`](#building-extensions) task, grafana-toolkit performs a Prettier check. If the check detects any Prettier issues, the build will not pass. To avoid such a situation we suggest developing the plugin with the [`grafana-toolkit plugin:dev --watch`](#developing-extensions) task running. This task tries to fix Prettier issues automatically. When building a plugin with the [`grafana-toolkit plugin:build`](#building-plugin) task, grafana-toolkit performs a Prettier check. If the check detects any Prettier issues, the build will not pass. To avoid such a situation we suggest developing the plugin with the [`grafana-toolkit plugin:dev --watch`](#developing-plugin) task running. This task tries to fix Prettier issues automatically.
### My editor does not respect Prettier config, what should I do? ### My editor does not respect Prettier config, what should I do?
In order for your editor to pick up our Prettier config you need to create a `.prettierrc.js` file in the root directory of your plugin with the following content: In order for your editor to pick up our Prettier config you need to create a `.prettierrc.js` file in the root directory of your plugin with the following content:


@@ -11,4 +11,5 @@ require('ts-node').register({
transpileOnly: true transpileOnly: true
}); });
require('../src/cli/index.ts').run(true); require('../src/cli/index.ts').run(true);


@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/toolkit", "name": "@grafana/toolkit",
"version": "6.4.0-pre", "version": "6.4.2",
"description": "Grafana Toolkit", "description": "Grafana Toolkit",
"keywords": [ "keywords": [
"grafana", "grafana",
@@ -28,6 +28,9 @@
"dependencies": { "dependencies": {
"@babel/core": "7.4.5", "@babel/core": "7.4.5",
"@babel/preset-env": "7.4.5", "@babel/preset-env": "7.4.5",
"@grafana/data": "6.4.2",
"@grafana/ui": "6.4.2",
"@types/command-exists": "^1.2.0",
"@types/execa": "^0.9.0", "@types/execa": "^0.9.0",
"@types/expect-puppeteer": "3.3.1", "@types/expect-puppeteer": "3.3.1",
"@types/inquirer": "^6.0.3", "@types/inquirer": "^6.0.3",
@@ -40,12 +43,11 @@
"@types/tmp": "^0.1.0", "@types/tmp": "^0.1.0",
"@types/webpack": "4.4.34", "@types/webpack": "4.4.34",
"aws-sdk": "^2.495.0", "aws-sdk": "^2.495.0",
"@grafana/data": "^6.4.0-alpha",
"@grafana/ui": "^6.4.0-alpha",
"axios": "0.19.0", "axios": "0.19.0",
"babel-loader": "8.0.6", "babel-loader": "8.0.6",
"babel-plugin-angularjs-annotate": "0.10.0", "babel-plugin-angularjs-annotate": "0.10.0",
"chalk": "^2.4.2", "chalk": "^2.4.2",
"command-exists": "^1.2.8",
"commander": "^2.20.0", "commander": "^2.20.0",
"concurrently": "4.1.0", "concurrently": "4.1.0",
"copy-webpack-plugin": "5.0.3", "copy-webpack-plugin": "5.0.3",
@@ -98,6 +100,5 @@
}, },
"_moduleAliases": { "_moduleAliases": {
"puppeteer": "node_modules/puppeteer-core" "puppeteer": "node_modules/puppeteer-core"
}, }
"types": "src/index.ts"
} }


@@ -21,6 +21,7 @@ import {
ciPluginReportTask, ciPluginReportTask,
} from './tasks/plugin.ci'; } from './tasks/plugin.ci';
import { buildPackageTask } from './tasks/package.build'; import { buildPackageTask } from './tasks/package.build';
import { pluginCreateTask } from './tasks/plugin.create';
export const run = (includeInternalScripts = false) => { export const run = (includeInternalScripts = false) => {
if (includeInternalScripts) { if (includeInternalScripts) {
@@ -61,6 +62,7 @@ export const run = (includeInternalScripts = false) => {
await execTask(changelogTask)({ await execTask(changelogTask)({
milestone: cmd.milestone, milestone: cmd.milestone,
silent: true,
}); });
}); });
@@ -89,8 +91,7 @@ export const run = (includeInternalScripts = false) => {
.command('toolkit:build') .command('toolkit:build')
.description('Prepares grafana/toolkit dist package') .description('Prepares grafana/toolkit dist package')
.action(async cmd => { .action(async cmd => {
// @ts-ignore await execTask(toolkitBuildTask)({});
await execTask(toolkitBuildTask)();
}); });
program program
@@ -117,11 +118,18 @@ export const run = (includeInternalScripts = false) => {
}); });
} }
program
.command('plugin:create [name]')
.description('Creates plugin from template')
.action(async cmd => {
await execTask(pluginCreateTask)({ name: cmd, silent: true });
});
program program
.command('plugin:build') .command('plugin:build')
.description('Prepares plugin dist package') .description('Prepares plugin dist package')
.action(async cmd => { .action(async cmd => {
await execTask(pluginBuildTask)({ coverage: false }); await execTask(pluginBuildTask)({ coverage: false, silent: true });
}); });
program program
@@ -133,6 +141,7 @@ export const run = (includeInternalScripts = false) => {
await execTask(pluginDevTask)({ await execTask(pluginDevTask)({
watch: !!cmd.watch, watch: !!cmd.watch,
yarnlink: !!cmd.yarnlink, yarnlink: !!cmd.yarnlink,
silent: true,
}); });
}); });
@@ -151,6 +160,7 @@ export const run = (includeInternalScripts = false) => {
watch: !!cmd.watch, watch: !!cmd.watch,
testPathPattern: cmd.testPathPattern, testPathPattern: cmd.testPathPattern,
testNamePattern: cmd.testNamePattern, testNamePattern: cmd.testNamePattern,
silent: true,
}); });
}); });


@@ -2,67 +2,84 @@
import * as _ from 'lodash'; import * as _ from 'lodash';
import { Task, TaskRunner } from './task'; import { Task, TaskRunner } from './task';
import GithubClient from '../utils/githubClient'; import GithubClient from '../utils/githubClient';
import difference from 'lodash/difference';
import chalk from 'chalk';
import { useSpinner } from '../utils/useSpinner';
interface ChangelogOptions { interface ChangelogOptions {
milestone: string; milestone: string;
} }
const changelogTaskRunner: TaskRunner<ChangelogOptions> = async ({ milestone }) => { const filterBugs = (item: any) => {
const githubClient = new GithubClient(); if (item.title.match(/fix|fixes/i)) {
const client = githubClient.client; return true;
}
if (item.labels.find((label: any) => label.name === 'type/bug')) {
return true;
}
return false;
};
if (!/^\d+$/.test(milestone)) { const getPackageChangelog = (packageName: string, issues: any[]) => {
console.log('Use milestone number not title, find number in milestone url'); if (issues.length === 0) {
return; return '';
} }
const res = await client.get('/issues', { let markdown = chalk.bold.yellow(`\n\n/*** ${packageName} changelog ***/\n\n`);
params: { const bugs = _.sortBy(issues.filter(filterBugs), 'title');
state: 'closed', const notBugs = _.sortBy(difference(issues, bugs), 'title');
per_page: 100,
labels: 'add to changelog',
milestone: milestone,
},
});
const issues = res.data;
const bugs = _.sortBy(
issues.filter((item: any) => {
if (item.title.match(/fix|fixes/i)) {
return true;
}
if (item.labels.find((label: any) => label.name === 'type/bug')) {
return true;
}
return false;
}),
'title'
);
const notBugs = _.sortBy(issues.filter((item: any) => !bugs.find((bug: any) => bug === item)), 'title');
let markdown = '';
if (notBugs.length > 0) { if (notBugs.length > 0) {
markdown = '### Features / Enhancements\n'; markdown += '### Features / Enhancements\n';
} for (const item of notBugs) {
markdown += getMarkdownLineForIssue(item);
for (const item of notBugs) { }
markdown += getMarkdownLineForIssue(item);
} }
if (bugs.length > 0) { if (bugs.length > 0) {
markdown += '\n### Bug Fixes\n'; markdown += '\n### Bug Fixes\n';
for (const item of bugs) {
markdown += getMarkdownLineForIssue(item);
}
} }
for (const item of bugs) { return markdown;
markdown += getMarkdownLineForIssue(item);
}
console.log(markdown);
}; };
const changelogTaskRunner: TaskRunner<ChangelogOptions> = useSpinner<ChangelogOptions>(
'Generating changelog',
async ({ milestone }) => {
const githubClient = new GithubClient();
const client = githubClient.client;
if (!/^\d+$/.test(milestone)) {
console.log('Use the milestone number, not the title; the number is in the milestone URL');
return;
}
const res = await client.get('/issues', {
params: {
state: 'closed',
per_page: 100,
labels: 'add to changelog',
milestone: milestone,
},
});
const issues = res.data;
const toolkitIssues = issues.filter((item: any) =>
item.labels.find((label: any) => label.name === 'area/grafana/toolkit')
);
let markdown = '';
markdown += getPackageChangelog('Grafana', issues);
markdown += getPackageChangelog('grafana-toolkit', toolkitIssues);
console.log(markdown);
}
);
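The bug/feature split that `getPackageChangelog` performs can be sketched standalone. In this sketch `GithubClient`, `chalk`, and lodash are replaced with plain objects and Array methods; the `Issue` shape is a simplified stand-in for the GitHub API payload, not the real type:

```typescript
// Simplified stand-in for the GitHub issue shape used by the changelog task.
interface Issue {
  title: string;
  labels: Array<{ name: string }>;
}

// An issue counts as a bug fix if its title mentions "fix"/"fixes"
// or it carries the "type/bug" label.
export const filterBugs = (item: Issue): boolean => {
  if (/fix|fixes/i.test(item.title)) {
    return true;
  }
  return item.labels.some(label => label.name === 'type/bug');
};

// Split issues into sorted bug and non-bug lists, as the task does
// with _.sortBy and lodash difference.
export const splitIssues = (issues: Issue[]) => {
  const bugs = issues.filter(filterBugs).sort((a, b) => a.title.localeCompare(b.title));
  const notBugs = issues
    .filter(item => !bugs.includes(item))
    .sort((a, b) => a.title.localeCompare(b.title));
  return { bugs, notBugs };
};
```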
function getMarkdownLineForIssue(item: any) { function getMarkdownLineForIssue(item: any) {
const githubGrafanaUrl = 'https://github.com/grafana/grafana'; const githubGrafanaUrl = 'https://github.com/grafana/grafana';
let markdown = ''; let markdown = '';


@@ -10,7 +10,10 @@ const cherryPickRunner: TaskRunner<CherryPickOptions> = async () => {
const res = await client.get('/issues', { const res = await client.get('/issues', {
params: { params: {
state: 'closed', state: 'closed',
per_page: 100,
labels: 'cherry-pick needed', labels: 'cherry-pick needed',
sort: 'closed',
direction: 'asc',
}, },
}); });


@@ -0,0 +1,47 @@
import { prompt } from 'inquirer';
import path from 'path';
import { Task, TaskRunner } from './task';
import { promptConfirm } from '../utils/prompt';
import {
getPluginIdFromName,
verifyGitExists,
promptPluginType,
fetchTemplate,
promptPluginDetails,
formatPluginDetails,
prepareJsonFiles,
removeGitFiles,
} from './plugin/create';
interface PluginCreateOptions {
name?: string;
}
const pluginCreateRunner: TaskRunner<PluginCreateOptions> = async ({ name }) => {
const destPath = path.resolve(process.cwd(), getPluginIdFromName(name || ''));
let pluginDetails;
// 1. Verify that git exists in the user's environment, since templates are cloned from git repositories
await verifyGitExists();
// 2. Prompt plugin template
const { type } = await promptPluginType();
// 3. Fetch plugin template from Github
await fetchTemplate({ type, dest: destPath });
// 4. Prompt plugin details
do {
pluginDetails = await promptPluginDetails(name);
formatPluginDetails(pluginDetails);
} while ((await prompt<{ confirm: boolean }>(promptConfirm('confirm', 'Is that ok?'))).confirm === false);
// 5. Update json files (package.json, src/plugin.json)
await prepareJsonFiles({ pluginDetails, pluginPath: destPath });
// 6. Remove cloned repository .git dir
await removeGitFiles(destPath);
};
export const pluginCreateTask = new Task<PluginCreateOptions>('plugin:create task', pluginCreateRunner);
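The prompt-until-confirmed loop in step 4 is a reusable pattern. A minimal sketch with the prompt functions injected so it can run non-interactively; `promptUntilConfirmed` is an illustrative name, not part of the toolkit API:

```typescript
// Repeatedly call `collect` until `confirm` approves the result — the same
// do/while shape the runner uses around promptPluginDetails.
export async function promptUntilConfirmed<T>(
  collect: () => Promise<T>,
  confirm: (details: T) => Promise<boolean>
): Promise<T> {
  let details: T;
  do {
    details = await collect();
  } while (!(await confirm(details)));
  return details;
}
```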


@@ -0,0 +1,150 @@
import commandExists from 'command-exists';
import { readFileSync, promises as fs } from 'fs';
import { prompt } from 'inquirer';
import kebabCase from 'lodash/kebabCase';
import path from 'path';
import gitPromise from 'simple-git/promise';
import { useSpinner } from '../../utils/useSpinner';
import { rmdir } from '../../utils/rmdir';
import { promptInput, promptConfirm } from '../../utils/prompt';
import chalk from 'chalk';
const simpleGit = gitPromise(process.cwd());
interface PluginDetails {
name: string;
org: string;
description: string;
author: boolean | string;
url: string;
keywords: string;
}
type PluginType = 'angular-panel' | 'react-panel' | 'datasource-plugin';
const RepositoriesPaths = {
'angular-panel': 'git@github.com:grafana/simple-angular-panel.git',
'react-panel': 'git@github.com:grafana/simple-react-panel.git',
'datasource-plugin': 'git@github.com:grafana/simple-datasource.git',
};
export const getGitUsername = async () => await simpleGit.raw(['config', '--global', 'user.name']);
export const getPluginIdFromName = (name: string) => kebabCase(name);
export const getPluginId = (pluginDetails: PluginDetails) =>
`${kebabCase(pluginDetails.org)}-${getPluginIdFromName(pluginDetails.name)}`;
export const getPluginKeywords = (pluginDetails: PluginDetails) =>
pluginDetails.keywords
.split(',')
.map(k => k.trim())
.filter(k => k !== '');
export const verifyGitExists = async () => {
return new Promise((resolve, reject) => {
commandExists('git', (err, exists) => {
if (exists) {
resolve(true);
}
reject(new Error('git is not installed'));
});
});
};
export const promptPluginType = async () =>
prompt<{ type: PluginType }>([
{
type: 'list',
message: 'Select plugin type',
name: 'type',
choices: [
{ name: 'Angular panel', value: 'angular-panel' },
{ name: 'React panel', value: 'react-panel' },
{ name: 'Datasource plugin', value: 'datasource-plugin' },
],
},
]);
export const promptPluginDetails = async (name?: string) => {
const username = (await getGitUsername()).trim();
const responses = await prompt<PluginDetails>([
promptInput('name', 'Plugin name', true, name),
promptInput('org', 'Organization (used as part of plugin ID)', true),
promptInput('description', 'Description'),
promptInput('keywords', 'Keywords (separated by comma)'),
// Try using the username configured in git
promptConfirm('author', `Author (${username})`, username, username !== ''),
// Prompt for manual author entry if no git user.name is specified
promptInput('author', `Author`, true, undefined, answers => !answers.author || username === ''),
promptInput('url', 'Your URL (i.e. organisation url)'),
]);
return {
...responses,
author: responses.author === true ? username : responses.author,
};
};
export const fetchTemplate = useSpinner<{ type: PluginType; dest: string }>(
'Fetching plugin template...',
async ({ type, dest }) => {
const url = RepositoriesPaths[type];
if (!url) {
throw new Error('Unknown plugin type');
}
await simpleGit.clone(url, dest);
}
);
export const prepareJsonFiles = useSpinner<{ pluginDetails: PluginDetails; pluginPath: string }>(
'Saving package.json and plugin.json files',
async ({ pluginDetails, pluginPath }) => {
const packageJsonPath = path.resolve(pluginPath, 'package.json');
const pluginJsonPath = path.resolve(pluginPath, 'src/plugin.json');
const packageJson: any = JSON.parse(readFileSync(packageJsonPath, 'utf8'));
const pluginJson: any = JSON.parse(readFileSync(pluginJsonPath, 'utf8'));
const pluginId = `${kebabCase(pluginDetails.org)}-${getPluginIdFromName(pluginDetails.name)}`;
packageJson.name = pluginId;
packageJson.author = pluginDetails.author;
packageJson.description = pluginDetails.description;
pluginJson.name = pluginDetails.name;
pluginJson.id = pluginId;
pluginJson.info = {
...pluginJson.info,
description: pluginDetails.description,
author: {
name: pluginDetails.author,
url: pluginDetails.url,
},
keywords: getPluginKeywords(pluginDetails),
};
await Promise.all(
[packageJson, pluginJson].map((f, i) => {
const filePath = i === 0 ? packageJsonPath : pluginJsonPath;
return fs.writeFile(filePath, JSON.stringify(f, null, 2));
})
);
}
);
export const removeGitFiles = useSpinner('Cleaning', async pluginPath => rmdir(`${path.resolve(pluginPath, '.git')}`));
export const formatPluginDetails = (details: PluginDetails) => {
console.group();
console.log();
console.log(chalk.bold.yellow('Your plugin details'));
console.log('---');
console.log(chalk.bold('Name: '), details.name);
console.log(chalk.bold('ID: '), getPluginId(details));
console.log(chalk.bold('Description: '), details.description);
console.log(chalk.bold('Keywords: '), getPluginKeywords(details));
console.log(chalk.bold('Author: '), details.author);
console.log(chalk.bold('Organisation: '), details.org);
console.log(chalk.bold('Website: '), details.url);
console.log();
console.groupEnd();
};
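The plugin-id derivation above leans on lodash's `kebabCase`; a hedged sketch with a minimal kebab-case stand-in (lodash covers more edge cases than this regex does):

```typescript
// Minimal kebab-case stand-in for lodash's kebabCase: split camelCase,
// collapse spaces/underscores to hyphens, lowercase the result.
const kebabCase = (s: string): string =>
  s
    .replace(/([a-z0-9])([A-Z])/g, '$1-$2')
    .replace(/[\s_]+/g, '-')
    .toLowerCase();

export const getPluginIdFromName = (name: string) => kebabCase(name);

// Plugin id is "<org>-<name>", both kebab-cased.
export const getPluginId = (org: string, name: string) =>
  `${kebabCase(org)}-${getPluginIdFromName(name)}`;

// Keywords: comma-separated string -> trimmed, non-empty list.
export const getPluginKeywords = (keywords: string): string[] =>
  keywords
    .split(',')
    .map(k => k.trim())
    .filter(k => k !== '');
```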


@@ -3,7 +3,6 @@ import * as fs from 'fs';
import chalk from 'chalk'; import chalk from 'chalk';
import { useSpinner } from '../utils/useSpinner'; import { useSpinner } from '../utils/useSpinner';
import { Task, TaskRunner } from './task'; import { Task, TaskRunner } from './task';
import escapeRegExp from 'lodash/escapeRegExp';
const path = require('path'); const path = require('path');
@@ -105,7 +104,9 @@ const copySassFiles = () => {
})(); })();
}; };
const toolkitBuildTaskRunner: TaskRunner<void> = async () => { interface ToolkitBuildOptions {}
const toolkitBuildTaskRunner: TaskRunner<ToolkitBuildOptions> = async () => {
cwd = path.resolve(__dirname, '../../../'); cwd = path.resolve(__dirname, '../../../');
distDir = `${cwd}/dist`; distDir = `${cwd}/dist`;
const pkg = require(`${cwd}/package.json`); const pkg = require(`${cwd}/package.json`);
@@ -118,21 +119,6 @@ const toolkitBuildTaskRunner: TaskRunner<void> = async () => {
fs.mkdirSync('./dist/sass'); fs.mkdirSync('./dist/sass');
await copyFiles(); await copyFiles();
await copySassFiles(); await copySassFiles();
// RYAN HACK HACK HACK
// when Dominik is back from vacation, we can find a better way
// This moves the index to the root so plugin e2e tests can import them
console.warn('hacking an index.js file for toolkit. Help!');
const index = `${distDir}/src/index.js`;
fs.readFile(index, 'utf8', (err, data) => {
const pattern = 'require("./';
const js = data.replace(new RegExp(escapeRegExp(pattern), 'g'), 'require("./src/');
fs.writeFile(`${distDir}/index.js`, js, err => {
if (err) {
throw new Error('Error writing index: ' + err);
}
});
});
}; };
export const toolkitBuildTask = new Task<void>('@grafana/toolkit build', toolkitBuildTaskRunner); export const toolkitBuildTask = new Task<ToolkitBuildOptions>('@grafana/toolkit build', toolkitBuildTaskRunner);


@@ -1,8 +1,15 @@
import { Task } from '../tasks/task'; import { Task } from '../tasks/task';
import chalk from 'chalk'; import chalk from 'chalk';
export const execTask = <TOptions>(task: Task<TOptions>) => async (options: TOptions) => { interface TaskBasicOptions {
console.log(chalk.yellow(`Running ${chalk.bold(task.name)} task`)); // Don't print task details when running
silent?: boolean;
}
export const execTask = <TOptions>(task: Task<TOptions>) => async (options: TOptions & TaskBasicOptions) => {
if (!options.silent) {
console.log(chalk.yellow(`Running ${chalk.bold(task.name)} task`));
}
task.setOptions(options); task.setOptions(options);
try { try {
console.group(); console.group();
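The `silent` flag threaded through the task invocations above works by intersecting each task's own options with a shared `TaskBasicOptions`. A console-free sketch, assuming `Task` needs only `name`, a runner, and `setOptions`; a `log` array replaces the chalk output so the behavior is observable:

```typescript
interface TaskBasicOptions {
  // Don't print task details when running
  silent?: boolean;
}

// Minimal stand-in for the toolkit's Task class.
class Task<TOptions> {
  options!: TOptions;
  constructor(public name: string, public runner: (options: TOptions) => Promise<void>) {}
  setOptions(options: TOptions) {
    this.options = options;
  }
}

export const log: string[] = [];

// Announce the task unless options.silent is set, then run it.
export const execTask = <TOptions>(task: Task<TOptions>) => async (
  options: TOptions & TaskBasicOptions
) => {
  if (!options.silent) {
    log.push(`Running ${task.name} task`);
  }
  task.setOptions(options);
  await task.runner(options);
};

export { Task };
```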


@@ -0,0 +1,58 @@
import {
Question,
InputQuestion,
CheckboxQuestion,
NumberQuestion,
PasswordQuestion,
EditorQuestion,
ConfirmQuestion,
} from 'inquirer';
type QuestionWithValidation<A = any> =
| InputQuestion<A>
| CheckboxQuestion<A>
| NumberQuestion<A>
| PasswordQuestion<A>
| EditorQuestion<A>;
export const answerRequired = (question: QuestionWithValidation): Question<any> => {
return {
...question,
validate: (answer: any) => answer.trim() !== '' || `${question.name} is required`,
};
};
export const promptInput = <A>(
name: string,
message: string | ((answers: A) => string),
required = false,
def: any = undefined,
when: boolean | ((answers: A) => boolean | Promise<boolean>) = true
) => {
const model: InputQuestion<A> = {
type: 'input',
name,
message,
default: def,
when,
};
return required ? answerRequired(model) : model;
};
export const promptConfirm = <A>(
name: string,
message: string | ((answers: A) => string),
def: any = undefined,
when: boolean | ((answers: A) => boolean | Promise<boolean>) = true
) => {
const model: ConfirmQuestion<A> = {
type: 'confirm',
name,
message,
default: def,
when,
};
return model;
};
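The `answerRequired` wrapper follows inquirer's `validate` contract: return `true` to accept an answer, or a string to reject it with that message. A self-contained sketch with a simplified question shape standing in for inquirer's types:

```typescript
// Simplified question shape standing in for inquirer's Question types.
interface SimpleQuestion {
  name: string;
  message: string;
  validate?: (answer: string) => boolean | string;
}

// Reject empty (or whitespace-only) answers with "<name> is required".
export const answerRequired = (question: SimpleQuestion): SimpleQuestion => ({
  ...question,
  validate: (answer: string) => answer.trim() !== '' || `${question.name} is required`,
});
```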


@@ -0,0 +1,23 @@
import fs = require('fs');
import path = require('path');
/**
* Remove directory recursively
* Ref https://stackoverflow.com/a/42505874
*/
export const rmdir = (dirPath: string) => {
if (!fs.existsSync(dirPath)) {
return;
}
fs.readdirSync(dirPath).forEach(entry => {
const entryPath = path.join(dirPath, entry);
if (fs.lstatSync(entryPath).isDirectory()) {
rmdir(entryPath);
} else {
fs.unlinkSync(entryPath);
}
});
fs.rmdirSync(dirPath);
};
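The recursive removal above needs only Node's standard library; with its imports spelled out it runs standalone:

```typescript
import fs from 'fs';
import path from 'path';

// Remove a directory tree recursively: delete children first (recursing into
// subdirectories), then the now-empty directory itself.
export const rmdir = (dirPath: string): void => {
  if (!fs.existsSync(dirPath)) {
    return;
  }
  for (const entry of fs.readdirSync(dirPath)) {
    const entryPath = path.join(dirPath, entry);
    if (fs.lstatSync(entryPath).isDirectory()) {
      rmdir(entryPath);
    } else {
      fs.unlinkSync(entryPath);
    }
  }
  fs.rmdirSync(dirPath);
};
```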


@@ -2,7 +2,7 @@ import ora from 'ora';
type FnToSpin<T> = (options: T) => Promise<void>; type FnToSpin<T> = (options: T) => Promise<void>;
export const useSpinner = <T>(spinnerLabel: string, fn: FnToSpin<T>, killProcess = true) => { export const useSpinner = <T = any>(spinnerLabel: string, fn: FnToSpin<T>, killProcess = true) => {
return async (options: T) => { return async (options: T) => {
const spinner = ora(spinnerLabel); const spinner = ora(spinnerLabel);
spinner.start(); spinner.start();
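`useSpinner` is a higher-order wrapper around ora: it starts a spinner, runs the wrapped async function, and marks success or failure. A sketch of the same shape with ora swapped for a plain event recorder; `withSpinner` and `events` are illustrative names, and the real helper also handles `killProcess` on failure:

```typescript
type FnToSpin<T> = (options: T) => Promise<void>;

export const events: string[] = [];

// Wrap an async task with start/succeed/fail notifications, recorded
// instead of drawn to the terminal.
export const withSpinner = <T = any>(label: string, fn: FnToSpin<T>) => {
  return async (options: T) => {
    events.push(`start: ${label}`);
    try {
      await fn(options);
      events.push(`succeed: ${label}`);
    } catch (err) {
      events.push(`fail: ${label}`);
      throw err;
    }
  };
};
```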


@@ -149,7 +149,7 @@ export const getWebpackConfig: WebpackConfigurationGetter = options => {
'emotion', 'emotion',
'prismjs', 'prismjs',
'slate-plain-serializer', 'slate-plain-serializer',
'slate-react', '@grafana/slate-react',
'react', 'react',
'react-dom', 'react-dom',
'react-redux', 'react-redux',


@@ -1,6 +0,0 @@
export * from './e2e';
// Namespace for Plugins
import * as plugins from './plugins';
export { plugins };


@@ -1,6 +1,7 @@
import { Browser, Page } from 'puppeteer-core'; import { Browser, Page } from 'puppeteer-core';
import { e2eScenario, takeScreenShot, plugins, pages } from '@grafana/toolkit'; import { e2eScenario, takeScreenShot, pages } from '@grafana/toolkit/src/e2e';
import { getEndToEndSettings } from '@grafana/toolkit/src/plugins';
// **************************************************************** // ****************************************************************
// NOTE: This file is copied to plugins at runtime, it is not run locally // NOTE: This file is copied to plugins at runtime, it is not run locally
@@ -11,7 +12,7 @@ const sleep = (milliseconds: number) => {
}; };
e2eScenario('Common Plugin Test', 'should pass', async (browser: Browser, page: Page) => { e2eScenario('Common Plugin Test', 'should pass', async (browser: Browser, page: Page) => {
const settings = plugins.getEndToEndSettings(); const settings = getEndToEndSettings();
const pluginPage = pages.getPluginPage(settings.plugin.id); const pluginPage = pages.getPluginPage(settings.plugin.id);
await pluginPage.init(page); await pluginPage.init(page);
await pluginPage.navigateTo(); await pluginPage.navigateTo();


@@ -7,7 +7,7 @@
"rootDirs": ["."], "rootDirs": ["."],
"outDir": "dist/src", "outDir": "dist/src",
"declaration": true, "declaration": true,
"declarationDir": "dist", "declarationDir": "dist/src",
"typeRoots": ["./node_modules/@types"], "typeRoots": ["./node_modules/@types"],
"esModuleInterop": true, "esModuleInterop": true,
"lib": ["es2015", "es2017.string", "dom"] "lib": ["es2015", "es2017.string", "dom"]


@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/ui", "name": "@grafana/ui",
"version": "6.4.0-pre", "version": "6.4.2",
"description": "Grafana Components Library", "description": "Grafana Components Library",
"keywords": [ "keywords": [
"grafana", "grafana",
@@ -25,11 +25,13 @@
"build": "grafana-toolkit package:build --scope=ui" "build": "grafana-toolkit package:build --scope=ui"
}, },
"dependencies": { "dependencies": {
"@grafana/data": "^6.4.0-alpha", "@grafana/data": "6.4.2",
"@grafana/slate-react": "0.22.9-grafana",
"@torkelo/react-select": "2.1.1", "@torkelo/react-select": "2.1.1",
"@types/react-color": "2.17.0", "@types/react-color": "2.17.0",
"classnames": "2.2.6", "classnames": "2.2.6",
"d3": "5.9.1", "d3": "5.9.1",
"immutable": "3.8.2",
"jquery": "3.4.1", "jquery": "3.4.1",
"lodash": "4.17.15", "lodash": "4.17.15",
"moment": "2.24.0", "moment": "2.24.0",
@@ -45,6 +47,7 @@
"react-storybook-addon-props-combinations": "1.1.0", "react-storybook-addon-props-combinations": "1.1.0",
"react-transition-group": "2.6.1", "react-transition-group": "2.6.1",
"react-virtualized": "9.21.0", "react-virtualized": "9.21.0",
"slate": "0.47.8",
"tinycolor2": "1.4.1" "tinycolor2": "1.4.1"
}, },
"devDependencies": { "devDependencies": {
@@ -65,6 +68,8 @@
"@types/react-custom-scrollbars": "4.0.5", "@types/react-custom-scrollbars": "4.0.5",
"@types/react-test-renderer": "16.8.1", "@types/react-test-renderer": "16.8.1",
"@types/react-transition-group": "2.0.16", "@types/react-transition-group": "2.0.16",
"@types/slate": "0.47.1",
"@types/slate-react": "0.22.5",
"@types/storybook__addon-actions": "3.4.2", "@types/storybook__addon-actions": "3.4.2",
"@types/storybook__addon-info": "4.1.1", "@types/storybook__addon-info": "4.1.1",
"@types/storybook__addon-knobs": "4.0.4", "@types/storybook__addon-knobs": "4.0.4",


@@ -1,6 +1,6 @@
import resolve from 'rollup-plugin-node-resolve'; import resolve from 'rollup-plugin-node-resolve';
import commonjs from 'rollup-plugin-commonjs'; import commonjs from 'rollup-plugin-commonjs';
import sourceMaps from 'rollup-plugin-sourcemaps'; // import sourceMaps from 'rollup-plugin-sourcemaps';
import { terser } from 'rollup-plugin-terser'; import { terser } from 'rollup-plugin-terser';
const pkg = require('./package.json'); const pkg = require('./package.json');
@@ -47,19 +47,20 @@ const buildCjsPackage = ({ env }) => {
], ],
'../../node_modules/react-color/lib/components/common': ['Saturation', 'Hue', 'Alpha'], '../../node_modules/react-color/lib/components/common': ['Saturation', 'Hue', 'Alpha'],
'../../node_modules/immutable/dist/immutable.js': [ '../../node_modules/immutable/dist/immutable.js': [
'Record',
'Set', 'Set',
'Map', 'Map',
'List', 'List',
'OrderedSet', 'OrderedSet',
'is', 'is',
'Stack', 'Stack',
'Record',
], ],
'node_modules/immutable/dist/immutable.js': ['Record', 'Set', 'Map', 'List', 'OrderedSet', 'is', 'Stack'],
'../../node_modules/esrever/esrever.js': ['reverse'], '../../node_modules/esrever/esrever.js': ['reverse'],
}, },
}), }),
resolve(), resolve(),
sourceMaps(), // sourceMaps(),
env === 'production' && terser(), env === 'production' && terser(),
], ],
}; };


@@ -19,7 +19,9 @@ export interface CommonButtonProps {
className?: string; className?: string;
} }
export interface LinkButtonProps extends CommonButtonProps, AnchorHTMLAttributes<HTMLAnchorElement> {} export interface LinkButtonProps extends CommonButtonProps, AnchorHTMLAttributes<HTMLAnchorElement> {
disabled?: boolean;
}
export interface ButtonProps extends CommonButtonProps, ButtonHTMLAttributes<HTMLButtonElement> {} export interface ButtonProps extends CommonButtonProps, ButtonHTMLAttributes<HTMLButtonElement> {}
interface AbstractButtonProps extends CommonButtonProps, Themeable { interface AbstractButtonProps extends CommonButtonProps, Themeable {


@@ -11,7 +11,7 @@ interface DataLinkEditorProps {
isLast: boolean; isLast: boolean;
value: DataLink; value: DataLink;
suggestions: VariableSuggestion[]; suggestions: VariableSuggestion[];
onChange: (index: number, link: DataLink) => void; onChange: (index: number, link: DataLink, callback?: () => void) => void;
onRemove: (link: DataLink) => void; onRemove: (link: DataLink) => void;
} }
@@ -20,8 +20,8 @@ export const DataLinkEditor: React.FC<DataLinkEditorProps> = React.memo(
const theme = useContext(ThemeContext); const theme = useContext(ThemeContext);
const [title, setTitle] = useState(value.title); const [title, setTitle] = useState(value.title);
const onUrlChange = (url: string) => { const onUrlChange = (url: string, callback?: () => void) => {
onChange(index, { ...value, url }); onChange(index, { ...value, url }, callback);
}; };
const onTitleChange = (event: ChangeEvent<HTMLInputElement>) => { const onTitleChange = (event: ChangeEvent<HTMLInputElement>) => {
setTitle(event.target.value); setTitle(event.target.value);


@@ -1,46 +1,39 @@
import React, { useState, useMemo, useCallback, useContext } from 'react'; import React, { useState, useMemo, useCallback, useContext, useRef, RefObject } from 'react';
import { VariableSuggestion, VariableOrigin, DataLinkSuggestions } from './DataLinkSuggestions'; import { VariableSuggestion, VariableOrigin, DataLinkSuggestions } from './DataLinkSuggestions';
import { makeValue, ThemeContext, DataLinkBuiltInVars } from '../../index'; import { ThemeContext, DataLinkBuiltInVars, makeValue } from '../../index';
import { SelectionReference } from './SelectionReference'; import { SelectionReference } from './SelectionReference';
import { Portal } from '../index'; import { Portal } from '../index';
// @ts-ignore
import { Editor } from 'slate-react'; import { Editor } from '@grafana/slate-react';
// @ts-ignore import { Value, Editor as CoreEditor } from 'slate';
import { Value, Change, Document } from 'slate';
// @ts-ignore
import Plain from 'slate-plain-serializer'; import Plain from 'slate-plain-serializer';
import { Popper as ReactPopper } from 'react-popper'; import { Popper as ReactPopper } from 'react-popper';
import useDebounce from 'react-use/lib/useDebounce';
import { css, cx } from 'emotion'; import { css, cx } from 'emotion';
// @ts-ignore
import PluginPrism from 'slate-prism'; import { SlatePrism } from '../../slate-plugins';
import { SCHEMA } from '../../utils/slate';
const modulo = (a: number, n: number) => a - n * Math.floor(a / n);
interface DataLinkInputProps { interface DataLinkInputProps {
value: string; value: string;
onChange: (url: string) => void; onChange: (url: string, callback?: () => void) => void;
suggestions: VariableSuggestion[]; suggestions: VariableSuggestion[];
} }
const plugins = [ const plugins = [
PluginPrism({ SlatePrism({
onlyIn: (node: any) => node.type === 'code_block', onlyIn: (node: any) => node.type === 'code_block',
getSyntax: () => 'links', getSyntax: () => 'links',
}), }),
]; ];
export const DataLinkInput: React.FC<DataLinkInputProps> = ({ value, onChange, suggestions }) => { export const DataLinkInput: React.FC<DataLinkInputProps> = ({ value, onChange, suggestions }) => {
const editorRef = useRef<Editor>() as RefObject<Editor>;
const theme = useContext(ThemeContext); const theme = useContext(ThemeContext);
const [showingSuggestions, setShowingSuggestions] = useState(false); const [showingSuggestions, setShowingSuggestions] = useState(false);
const [suggestionsIndex, setSuggestionsIndex] = useState(0); const [suggestionsIndex, setSuggestionsIndex] = useState(0);
const [usedSuggestions, setUsedSuggestions] = useState( const [linkUrl, setLinkUrl] = useState<Value>(makeValue(value));
suggestions.filter(suggestion => {
return value.indexOf(suggestion.value) > -1;
})
);
// Using any here as TS has problems picking up the existence of the `change` method on Value
// According to code and documentation `change` is an instance method on Value in slate 0.33.8 that we use
// https://github.com/ianstormtaylor/slate/blob/slate%400.33.8/docs/reference/slate/value.md#change
const [linkUrl, setLinkUrl] = useState<any>(makeValue(value));
const getStyles = useCallback(() => { const getStyles = useCallback(() => {
return { return {
@@ -55,99 +48,67 @@ export const DataLinkInput: React.FC<DataLinkInputProps> = ({ value, onChange, s
}; };
}, [theme]); }, [theme]);
const currentSuggestions = useMemo( // Workaround for https://github.com/ianstormtaylor/slate/issues/2927
() => const stateRef = useRef({ showingSuggestions, suggestions, suggestionsIndex, linkUrl, onChange });
suggestions.filter(suggestion => { stateRef.current = { showingSuggestions, suggestions, suggestionsIndex, linkUrl, onChange };
return usedSuggestions.map(s => s.value).indexOf(suggestion.value) === -1;
}),
[usedSuggestions, suggestions]
);
// SelectionReference is used to position the variables suggestion relatively to current DOM selection // SelectionReference is used to position the variables suggestion relatively to current DOM selection
const selectionRef = useMemo(() => new SelectionReference(), [setShowingSuggestions]); const selectionRef = useMemo(() => new SelectionReference(), [setShowingSuggestions, linkUrl]);
// Keep track of variables that have been used already const onKeyDown = React.useCallback((event: KeyboardEvent, next: () => any) => {
const updateUsedSuggestions = () => { if (!stateRef.current.showingSuggestions) {
const currentLink = Plain.serialize(linkUrl); if (event.key === '=' || event.key === '$' || (event.keyCode === 32 && event.ctrlKey)) {
const next = usedSuggestions.filter(suggestion => { return setShowingSuggestions(true);
return currentLink.indexOf(suggestion.value) > -1;
});
if (next.length !== usedSuggestions.length) {
setUsedSuggestions(next);
}
};
useDebounce(updateUsedSuggestions, 250, [linkUrl]);
const onKeyDown = (event: KeyboardEvent) => {
if (event.key === 'Backspace' || event.key === 'Escape') {
setShowingSuggestions(false);
setSuggestionsIndex(0);
}
if (event.key === 'Enter') {
if (showingSuggestions) {
onVariableSelect(currentSuggestions[suggestionsIndex]);
} }
return next();
} }
if (showingSuggestions) { switch (event.key) {
if (event.key === 'ArrowDown') { case 'Backspace':
case 'Escape':
setShowingSuggestions(false);
return setSuggestionsIndex(0);
case 'Enter':
event.preventDefault(); event.preventDefault();
setSuggestionsIndex(index => { return onVariableSelect(stateRef.current.suggestions[stateRef.current.suggestionsIndex]);
return (index + 1) % currentSuggestions.length;
}); case 'ArrowDown':
} case 'ArrowUp':
if (event.key === 'ArrowUp') {
event.preventDefault(); event.preventDefault();
setSuggestionsIndex(index => { const direction = event.key === 'ArrowDown' ? 1 : -1;
const nextIndex = index - 1 < 0 ? currentSuggestions.length - 1 : (index - 1) % currentSuggestions.length; return setSuggestionsIndex(index => modulo(index + direction, stateRef.current.suggestions.length));
return nextIndex; default:
}); return next();
}
} }
}, []);
if (event.key === '?' || event.key === '&' || event.key === '$' || (event.keyCode === 32 && event.ctrlKey)) { const onUrlChange = React.useCallback(({ value }: { value: Value }) => {
setShowingSuggestions(true);
}
if (event.key === 'Enter' && showingSuggestions) {
// Preventing entering a new line
// As of https://github.com/ianstormtaylor/slate/issues/1345#issuecomment-340508289
return false;
} else {
// @ts-ignore
return;
}
};
const onUrlChange = ({ value }: Change) => {
setLinkUrl(value); setLinkUrl(value);
}; }, []);
const onUrlBlur = () => { const onUrlBlur = React.useCallback((event: Event, editor: CoreEditor, next: () => any) => {
onChange(Plain.serialize(linkUrl)); // Callback needed for blur to work correctly
}; stateRef.current.onChange(Plain.serialize(stateRef.current.linkUrl), () => {
editorRef.current!.blur();
const onVariableSelect = (item: VariableSuggestion) => { });
const includeDollarSign = Plain.serialize(linkUrl).slice(-1) !== '$'; }, []);
const change = linkUrl.change();
const onVariableSelect = (item: VariableSuggestion, editor = editorRef.current!) => {
const includeDollarSign = Plain.serialize(editor.value).slice(-1) !== '$';
if (item.origin !== VariableOrigin.Template || item.value === DataLinkBuiltInVars.includeVars) { if (item.origin !== VariableOrigin.Template || item.value === DataLinkBuiltInVars.includeVars) {
change.insertText(`${includeDollarSign ? '$' : ''}\{${item.value}}`); editor.insertText(`${includeDollarSign ? '$' : ''}\{${item.value}}`);
} else { } else {
change.insertText(`var-${item.value}=$\{${item.value}}`); editor.insertText(`var-${item.value}=$\{${item.value}}`);
} }
setLinkUrl(change.value); setLinkUrl(editor.value);
setShowingSuggestions(false); setShowingSuggestions(false);
setUsedSuggestions((previous: VariableSuggestion[]) => {
return [...previous, item];
});
setSuggestionsIndex(0); setSuggestionsIndex(0);
onChange(Plain.serialize(change.value)); onChange(Plain.serialize(editor.value));
}; };
return (
<div
className={cx(
@@ -163,7 +124,7 @@ export const DataLinkInput: React.FC<DataLinkInputProps> = ({ value, onChange, s
<Portal>
<ReactPopper
referenceElement={selectionRef}
placement="auto-end" placement="top-end"
modifiers={{
preventOverflow: { enabled: true, boundariesElement: 'window' },
arrow: { enabled: false },
@@ -174,7 +135,7 @@ export const DataLinkInput: React.FC<DataLinkInputProps> = ({ value, onChange, s
return (
<div ref={ref} style={style} data-placement={placement}>
<DataLinkSuggestions
suggestions={currentSuggestions} suggestions={stateRef.current.suggestions}
onSuggestionSelect={onVariableSelect}
onClose={() => setShowingSuggestions(false)}
activeIndex={suggestionsIndex}
@@ -186,11 +147,13 @@ export const DataLinkInput: React.FC<DataLinkInputProps> = ({ value, onChange, s
</Portal>
)}
<Editor
schema={SCHEMA}
ref={editorRef}
placeholder="http://your-grafana.com/d/000000010/annotations"
value={linkUrl} value={stateRef.current.linkUrl}
onChange={onUrlChange}
onBlur={onUrlBlur}
onKeyDown={onKeyDown} onKeyDown={(event, _editor, next) => onKeyDown(event as KeyboardEvent, next)}
plugins={plugins}
className={getStyles().editor}
/>
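The `onVariableSelect` handler above avoids emitting a doubled dollar sign when the cursor already sits after a `$`. A standalone sketch of that rule, using plain strings instead of a Slate editor value (function and argument names here are hypothetical):

```typescript
// If the text already ends with '$', only append '{name}'; otherwise append '${name}'.
function insertVariable(current: string, name: string): string {
  const includeDollarSign = current.slice(-1) !== '$';
  return current + `${includeDollarSign ? '$' : ''}{${name}}`;
}
```

Either way the result ends with a well-formed `${name}` interpolation.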

View File

@@ -12,7 +12,7 @@ import { VariableSuggestion } from './DataLinkSuggestions';
interface DataLinksEditorProps {
value: DataLink[];
onChange: (links: DataLink[]) => void; onChange: (links: DataLink[], callback?: () => void) => void;
suggestions: VariableSuggestion[];
maxLinks?: number;
}
@@ -30,14 +30,15 @@ export const DataLinksEditor: FC<DataLinksEditorProps> = React.memo(({ value, on
onChange(value ? [...value, { url: '', title: '' }] : [{ url: '', title: '' }]);
};
const onLinkChanged = (linkIndex: number, newLink: DataLink) => { const onLinkChanged = (linkIndex: number, newLink: DataLink, callback?: () => void) => {
onChange(
value.map((item, listIndex) => {
if (linkIndex === listIndex) {
return newLink;
}
return item;
}) }),
callback
);
};

View File

@@ -83,9 +83,9 @@ $select-input-bg-disabled: $input-bg-disabled;
.gf-form-select-box__multi-value__remove {
text-align: center;
display: inline-block;
height: 14px;
vertical-align: middle;
margin-left: 2px;
position: relative;
top: 3px;
}
.gf-form-select-box__multi-value__label {
@@ -111,6 +111,10 @@ $select-input-bg-disabled: $input-bg-disabled;
}
}
.gf-form-select-box__placeholder {
color: $input-color-placeholder;
}
.gf-form-select-box__control--is-focused .gf-form-select-box__placeholder {
display: none;
}

View File

@@ -0,0 +1,34 @@
import React from 'react';
import { storiesOf } from '@storybook/react';
import { action } from '@storybook/addon-actions';
import { TimeZonePicker } from './TimeZonePicker';
import { UseState } from '../../utils/storybook/UseState';
import { withCenteredStory } from '../../utils/storybook/withCenteredStory';
const TimeZonePickerStories = storiesOf('UI/TimeZonePicker', module);
TimeZonePickerStories.addDecorator(withCenteredStory);
TimeZonePickerStories.add('default', () => {
return (
<UseState
initialState={{
value: 'europe/stockholm',
}}
>
{(value, updateValue) => {
return (
<TimeZonePicker
value={value.value}
onChange={newValue => {
action('on selected')(newValue);
updateValue({ value: newValue });
}}
width={20}
/>
);
}}
</UseState>
);
});

View File

@@ -0,0 +1,41 @@
import React, { FC } from 'react';
import { getTimeZoneGroups, SelectableValue } from '@grafana/data';
import { Select } from '..';
interface Props {
value: string;
width?: number;
onChange: (newValue: string) => void;
}
export const TimeZonePicker: FC<Props> = ({ onChange, value, width }) => {
const timeZoneGroups = getTimeZoneGroups();
const groupOptions = timeZoneGroups.map(group => {
const options = group.options.map(timeZone => {
return {
label: timeZone,
value: timeZone,
};
});
return {
label: group.label,
options,
};
});
const selectedValue = groupOptions.map(group => {
return group.options.find(option => option.value === value);
});
return (
<Select
options={groupOptions}
value={selectedValue}
onChange={(newValue: SelectableValue) => onChange(newValue.value)}
width={width}
/>
);
};
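The mapping above can be sketched independently of the `Select` component; the shape returned by `getTimeZoneGroups()` is assumed here to be `{ label, options: string[] }` entries:

```typescript
interface ZoneGroup { label: string; options: string[]; }

// Map each zone group to the grouped-options shape expected by a select component.
function toGroupedOptions(groups: ZoneGroup[]) {
  return groups.map(group => ({
    label: group.label,
    options: group.options.map(timeZone => ({ label: timeZone, value: timeZone })),
  }));
}
```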

View File

@@ -11,7 +11,7 @@ export const ReduceTransformerEditor: React.FC<TransformerUIProps<ReduceTransfor
}) => {
return (
<StatsPicker
width={12} width={25}
placeholder="Choose Stat"
allowMultiple
stats={options.reducers || []}

View File

@@ -2,3 +2,4 @@ export * from './components';
export * from './types';
export * from './utils';
export * from './themes';
export * from './slate-plugins';

View File

@@ -0,0 +1 @@
export { SlatePrism } from './slate-prism';

View File

@@ -0,0 +1,3 @@
const TOKEN_MARK = 'prism-token';
export default TOKEN_MARK;

View File

@@ -0,0 +1,160 @@
import Prism from 'prismjs';
import { Block, Text, Decoration } from 'slate';
import { Plugin } from '@grafana/slate-react';
import Options, { OptionsFormat } from './options';
import TOKEN_MARK from './TOKEN_MARK';
/**
* A Slate plugin to highlight code syntax.
*/
export function SlatePrism(optsParam: OptionsFormat = {}): Plugin {
const opts: Options = new Options(optsParam);
return {
decorateNode: (node, editor, next) => {
if (!opts.onlyIn(node)) {
return next();
}
return decorateNode(opts, Block.create(node as Block));
},
renderDecoration: (props, editor, next) =>
opts.renderDecoration(
{
children: props.children,
decoration: props.decoration,
},
editor as any,
next
),
};
}
/**
* Returns the decoration for a node
*/
function decorateNode(opts: Options, block: Block) {
const grammarName = opts.getSyntax(block);
const grammar = Prism.languages[grammarName];
if (!grammar) {
// Grammar not loaded
return [];
}
// Tokenize the whole block text
const texts = block.getTexts();
const blockText = texts.map(text => text && text.getText()).join('\n');
const tokens = Prism.tokenize(blockText, grammar);
// The list of decorations to return
const decorations: Decoration[] = [];
let textStart = 0;
let textEnd = 0;
texts.forEach(text => {
textEnd = textStart + text!.getText().length;
let offset = 0;
function processToken(token: string | Prism.Token, accu?: string | number) {
if (typeof token === 'string') {
if (accu) {
const decoration = createDecoration({
text: text!,
textStart,
textEnd,
start: offset,
end: offset + token.length,
className: `prism-token token ${accu}`,
block,
});
if (decoration) {
decorations.push(decoration);
}
}
offset += token.length;
} else {
accu = `${accu} ${token.type} ${token.alias || ''}`;
if (typeof token.content === 'string') {
const decoration = createDecoration({
text: text!,
textStart,
textEnd,
start: offset,
end: offset + token.content.length,
className: `prism-token token ${accu}`,
block,
});
if (decoration) {
decorations.push(decoration);
}
offset += token.content.length;
} else {
// When using token.content instead of token.matchedStr, token can be deep
for (let i = 0; i < token.content.length; i += 1) {
// @ts-ignore
processToken(token.content[i], accu);
}
}
}
}
tokens.forEach(processToken);
textStart = textEnd + 1; // account for added `\n`
});
return decorations;
}
/**
* Return a decoration range for the given text.
*/
function createDecoration({
text,
textStart,
textEnd,
start,
end,
className,
block,
}: {
text: Text; // The text being decorated
textStart: number; // Its start position in the whole text
textEnd: number; // Its end position in the whole text
start: number; // The position in the whole text where the token starts
end: number; // The position in the whole text where the token ends
className: string; // The prism token classname
block: Block;
}): Decoration | null {
if (start >= textEnd || end <= textStart) {
// Ignore, the token is not in the text
return null;
}
// Shrink to this text boundaries
start = Math.max(start, textStart);
end = Math.min(end, textEnd);
// Now shift offsets to be relative to this text
start -= textStart;
end -= textStart;
const myDec = block.createDecoration({
object: 'decoration',
anchor: {
key: text.key,
offset: start,
object: 'point',
},
focus: {
key: text.key,
offset: end,
object: 'point',
},
type: TOKEN_MARK,
data: { className },
});
return myDec;
}
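The index arithmetic in `createDecoration` above can be isolated into a small pure function: clip a token's global `[start, end)` range against one text node's `[textStart, textEnd)` range, then shift it to be node-relative. A minimal sketch (the function name is illustrative):

```typescript
// Returns node-relative [start, end) for the overlapping part, or null when
// the token does not intersect the text node at [textStart, textEnd).
function clampToText(
  start: number,
  end: number,
  textStart: number,
  textEnd: number
): [number, number] | null {
  if (start >= textEnd || end <= textStart) {
    return null;
  }
  return [Math.max(start, textStart) - textStart, Math.min(end, textEnd) - textStart];
}
```

This is why `decorateNode` can tokenize the whole block as one string joined with `\n` and still attach each decoration to the right line.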

View File

@@ -0,0 +1,77 @@
import React from 'react';
import { Mark, Node, Decoration } from 'slate';
import { Editor } from '@grafana/slate-react';
import { Record } from 'immutable';
import TOKEN_MARK from './TOKEN_MARK';
export interface OptionsFormat {
// Determine which node should be highlighted
onlyIn?: (node: Node) => boolean;
// Returns the syntax for a node that should be highlighted
getSyntax?: (node: Node) => string;
// Render a highlighting mark in a highlighted node
renderMark?: ({ mark, children }: { mark: Mark; children: React.ReactNode }) => void | React.ReactNode;
}
/**
* Default filter for code blocks
*/
function defaultOnlyIn(node: Node): boolean {
return node.object === 'block' && node.type === 'code_block';
}
/**
* Default getter for syntax
*/
function defaultGetSyntax(node: Node): string {
return 'javascript';
}
/**
* Default rendering for decorations
*/
function defaultRenderDecoration(
props: { children: React.ReactNode; decoration: Decoration },
editor: Editor,
next: () => any
): void | React.ReactNode {
const { decoration } = props;
if (decoration.type !== TOKEN_MARK) {
return next();
}
const className = decoration.data.get('className');
return <span className={className}>{props.children}</span>;
}
/**
* The plugin options
*/
class Options
extends Record({
onlyIn: defaultOnlyIn,
getSyntax: defaultGetSyntax,
renderDecoration: defaultRenderDecoration,
})
implements OptionsFormat {
readonly onlyIn!: (node: Node) => boolean;
readonly getSyntax!: (node: Node) => string;
readonly renderDecoration!: (
{
decoration,
children,
}: {
decoration: Decoration;
children: React.ReactNode;
},
editor: Editor,
next: () => any
) => void | React.ReactNode;
constructor(props: OptionsFormat) {
super(props);
}
}
export default Options;

View File

@@ -193,6 +193,7 @@ $btn-semi-transparent: rgba(0, 0, 0, 0.2) !default;
// sidemenu
$side-menu-width: 60px;
$navbar-padding: 20px;
// dashboard
$dashboard-padding: $space-md;

View File

@@ -16,6 +16,8 @@ export interface PanelData {
series: DataFrame[];
request?: DataQueryRequest;
error?: DataQueryError;
// Contains the range from the request or a shifted time range if a request uses relative time
timeRange: TimeRange;
}
export interface PanelProps<T = any> {

View File

@@ -10,6 +10,7 @@ export enum PluginType {
panel = 'panel',
datasource = 'datasource',
app = 'app',
renderer = 'renderer',
}
export interface PluginMeta<T extends {} = KeyValue> {

View File

@@ -50,13 +50,13 @@ function getTitleTemplate(title: string | undefined, stats: string[], data?: Dat
const parts: string[] = [];
if (stats.length > 1) {
parts.push('$' + VAR_CALC); parts.push('${' + VAR_CALC + '}');
}
if (data.length > 1) {
parts.push('${' + VAR_SERIES_NAME + '}');
}
if (fieldCount > 1 || !parts.length) {
parts.push('$' + VAR_FIELD_NAME); parts.push('${' + VAR_FIELD_NAME + '}');
}
return parts.join(' ');
}
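The fix above switches every variable to the braced `${var}` form so all three interpolations use one consistent syntax. A self-contained sketch of the corrected logic (the `VAR_*` values below are illustrative; the real constants live in the surrounding module):

```typescript
const VAR_CALC = '__calc';
const VAR_SERIES_NAME = '__series.name';
const VAR_FIELD_NAME = '__field.name';

// Build the default title template from how many stats, series, and fields exist.
function getTitleTemplateSketch(statCount: number, seriesCount: number, fieldCount: number): string {
  const parts: string[] = [];
  if (statCount > 1) {
    parts.push('${' + VAR_CALC + '}');
  }
  if (seriesCount > 1) {
    parts.push('${' + VAR_SERIES_NAME + '}');
  }
  if (fieldCount > 1 || !parts.length) {
    parts.push('${' + VAR_FIELD_NAME + '}');
  }
  return parts.join(' ');
}
```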

View File

@@ -1,22 +1,22 @@
// @ts-ignore import { Block, Document, Text, Value, SchemaProperties } from 'slate';
import { Block, Document, Text, Value } from 'slate';
const SCHEMA = { export const SCHEMA: SchemaProperties = {
blocks: { document: {
paragraph: 'paragraph', nodes: [
codeblock: 'code_block', {
codeline: 'code_line', match: [{ type: 'paragraph' }, { type: 'code_block' }, { type: 'code_line' }],
},
],
},
inlines: {},
marks: {},
};
export const makeFragment = (text: string, syntax?: string) => { export const makeFragment = (text: string, syntax?: string): Document => {
const lines = text.split('\n').map(line =>
Block.create({
type: 'code_line',
nodes: [Text.create(line)],
} as any) })
);
const block = Block.create({
@@ -25,18 +25,17 @@ export const makeFragment = (text: string, syntax?: string) => {
},
type: 'code_block',
nodes: lines,
} as any); });
return Document.create({
nodes: [block],
});
};
export const makeValue = (text: string, syntax?: string) => { export const makeValue = (text: string, syntax?: string): Value => {
const fragment = makeFragment(text, syntax);
return Value.create({
document: fragment,
SCHEMA, });
} as any);
};

View File

@@ -154,7 +154,7 @@ export const getCategories = (): ValueFormatCategory[] => [
{ name: 'gigabytes/sec', id: 'GBs', fn: decimalSIPrefix('Bs', 3) },
{ name: 'gigabits/sec', id: 'Gbits', fn: decimalSIPrefix('bps', 3) },
{ name: 'terabytes/sec', id: 'TBs', fn: decimalSIPrefix('Bs', 4) },
{ name: 'terabits/sec', id: 'Gbits', fn: decimalSIPrefix('bps', 4) }, { name: 'terabits/sec', id: 'Tbits', fn: decimalSIPrefix('bps', 4) },
{ name: 'petabytes/sec', id: 'PBs', fn: decimalSIPrefix('Bs', 5) },
{ name: 'petabits/sec', id: 'Pbits', fn: decimalSIPrefix('bps', 5) },
],

View File

@@ -1,6 +1,14 @@
import { toFixed, getValueFormat } from './valueFormats';
describe('valueFormats', () => {
describe('toFixed with edge cases', () => {
it('should handle non number input gracefully', () => {
expect(toFixed(NaN)).toBe('NaN');
expect(toFixed(Number.NEGATIVE_INFINITY)).toBe('-Inf');
expect(toFixed(Number.POSITIVE_INFINITY)).toBe('Inf');
});
});
describe('toFixed and negative decimals', () => {
it('should treat as zero decimals', () => {
const str = toFixed(186.123, -2);

View File

@@ -33,6 +33,12 @@ export function toFixed(value: number, decimals?: DecimalCount): string {
if (value === null) {
return '';
}
if (value === Number.NEGATIVE_INFINITY) {
return '-Inf';
}
if (value === Number.POSITIVE_INFINITY) {
return 'Inf';
}
const factor = decimals ? Math.pow(10, Math.max(0, decimals)) : 1;
const formatted = String(Math.round(value * factor) / factor);
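The new guards above catch the infinities explicitly; `NaN` needs no guard because `String(NaN)` already yields `'NaN'`. A simplified model of the patched function (the real `toFixed` also pads trailing decimal places, which is omitted here):

```typescript
// Simplified sketch of the edge-case handling; negative decimals act as zero
// because Math.max(0, decimals) clamps the exponent.
function toFixedSketch(value: number, decimals?: number): string {
  if (value === null) {
    return '';
  }
  if (value === Number.NEGATIVE_INFINITY) {
    return '-Inf';
  }
  if (value === Number.POSITIVE_INFINITY) {
    return 'Inf';
  }
  const factor = decimals ? Math.pow(10, Math.max(0, decimals)) : 1;
  return String(Math.round(value * factor) / factor);
}
```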

View File

@@ -5,6 +5,10 @@
"compilerOptions": {
"rootDirs": [".", "stories"],
"typeRoots": ["./node_modules/@types", "types"],
"baseUrl": "./node_modules/@types",
"paths": {
"@grafana/slate-react": ["slate-react"]
},
"declarationDir": "dist",
"outDir": "compiled"
}

View File

@@ -23,7 +23,8 @@ ENV PATH=/usr/share/grafana/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bi
WORKDIR $GF_PATHS_HOME
RUN apk add --no-cache ca-certificates bash RUN apk add --no-cache ca-certificates bash && \
apk add --no-cache --upgrade --repository=http://dl-cdn.alpinelinux.org/alpine/edge/main openssl musl-utils
# PhantomJS
RUN if [ `arch` = "x86_64" ]; then \

View File

@@ -135,3 +135,15 @@ func Respond(status int, body interface{}) *NormalResponse {
header: make(http.Header),
}
}
type RedirectResponse struct {
location string
}
func (r *RedirectResponse) WriteTo(ctx *m.ReqContext) {
ctx.Redirect(r.location)
}
func Redirect(location string) *RedirectResponse {
return &RedirectResponse{location: location}
}

View File

@@ -34,7 +34,7 @@ type LDAPAttribute struct {
}
// RoleDTO is a serializer for mapped roles from LDAP
type RoleDTO struct { type LDAPRoleDTO struct {
OrgId int64 `json:"orgId"`
OrgName string `json:"orgName"`
OrgRole models.RoleType `json:"orgRole"`
@@ -49,7 +49,7 @@ type LDAPUserDTO struct {
Username *LDAPAttribute `json:"login"`
IsGrafanaAdmin *bool `json:"isGrafanaAdmin"`
IsDisabled bool `json:"isDisabled"`
OrgRoles []RoleDTO `json:"roles"` OrgRoles []LDAPRoleDTO `json:"roles"`
Teams []models.TeamOrgGroupDTO `json:"teams"`
}
@@ -90,6 +90,10 @@ func (user *LDAPUserDTO) FetchOrgs() error {
}
for i, orgDTO := range user.OrgRoles {
if orgDTO.OrgId < 1 {
continue
}
orgName := orgNamesById[orgDTO.OrgId]
if orgName != "" {
@@ -256,7 +260,7 @@ func (server *HTTPServer) GetUserFromLDAP(c *models.ReqContext) Response {
user, serverConfig, err := ldap.User(username)
if user == nil {
return Error(http.StatusNotFound, "No user was found on the LDAP server(s)", err) return Error(http.StatusNotFound, "No user was found in the LDAP server(s) with that username", err)
}
logger.Debug("user found", "user", user)
@@ -272,22 +276,32 @@ func (server *HTTPServer) GetUserFromLDAP(c *models.ReqContext) Response {
IsDisabled: user.IsDisabled,
}
orgRoles := []RoleDTO{} orgRoles := []LDAPRoleDTO{}
for _, g := range serverConfig.Groups { // First, let's find the groupDN that we did match by inspecting the assigned user OrgRoles.
role := &RoleDTO{} for _, group := range serverConfig.Groups {
orgRole, ok := user.OrgRoles[group.OrgId]
if isMatchToLDAPGroup(user, g) { if ok && orgRole == group.OrgRole {
role.OrgId = g.OrgID r := &LDAPRoleDTO{GroupDN: group.GroupDN, OrgId: group.OrgId, OrgRole: group.OrgRole}
role.OrgRole = user.OrgRoles[g.OrgID] orgRoles = append(orgRoles, *r)
role.GroupDN = g.GroupDN }
}
orgRoles = append(orgRoles, *role) // Then, we find what we did not match by inspecting the list of groups returned from
} else { // LDAP against what we have already matched above.
role.OrgId = g.OrgID for _, userGroup := range user.Groups {
role.GroupDN = g.GroupDN var matches int
orgRoles = append(orgRoles, *role) for _, orgRole := range orgRoles {
if orgRole.GroupDN == userGroup { // we already matched it
matches++
}
}
if matches < 1 {
r := &LDAPRoleDTO{GroupDN: userGroup}
orgRoles = append(orgRoles, *r)
}
}
@@ -312,12 +326,6 @@ func (server *HTTPServer) GetUserFromLDAP(c *models.ReqContext) Response {
return JSON(200, u)
}
// isMatchToLDAPGroup determines if we were able to match an LDAP group to an organization+role.
// Since we allow one role per organization. If it's set, we were able to match it.
func isMatchToLDAPGroup(user *models.ExternalUserInfo, groupConfig *ldap.GroupToOrgRole) bool {
return user.OrgRoles[groupConfig.OrGID] == groupConfig.OrgRole
}
// splitName receives the full name of a user and splits it into two parts: A name and a surname.
func splitName(name string) (string, string) {
names := util.SplitString(name)
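The rewritten handler above builds the roles list in two passes: first the configured groups whose role was actually assigned to the user, then the raw LDAP groups that matched no configuration (reported with an empty org). An illustrative TypeScript sketch of that split, with hypothetical types standing in for the Go structs:

```typescript
interface GroupToOrgRole { groupDN: string; orgId: number; orgRole: string; }
interface RoleDTO { groupDN: string; orgId?: number; orgRole?: string; }

function buildOrgRoles(
  configured: GroupToOrgRole[],
  userGroups: string[],
  assignedRoles: Map<number, string>
): RoleDTO[] {
  const roles: RoleDTO[] = [];
  // First pass: configured groups whose role was actually assigned to the user.
  for (const group of configured) {
    if (assignedRoles.get(group.orgId) === group.orgRole) {
      roles.push({ groupDN: group.groupDN, orgId: group.orgId, orgRole: group.orgRole });
    }
  }
  // Second pass: groups returned from LDAP that matched nothing above.
  for (const dn of userGroups) {
    if (!roles.some(role => role.groupDN === dn)) {
      roles.push({ groupDN: dn });
    }
  }
  return roles;
}
```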

View File

@@ -94,7 +94,7 @@ func TestGetUserFromLDAPApiEndpoint_UserNotFound(t *testing.T) {
sc := getUserFromLDAPContext(t, "/api/admin/ldap/user-that-does-not-exist")
require.Equal(t, sc.resp.Code, http.StatusNotFound)
assert.JSONEq(t, "{\"message\":\"No user was found on the LDAP server(s)\"}", sc.resp.Body.String()) assert.JSONEq(t, "{\"message\":\"No user was found in the LDAP server(s) with that username\"}", sc.resp.Body.String())
}
func TestGetUserFromLDAPApiEndpoint_OrgNotfound(t *testing.T) {
@@ -103,6 +103,7 @@ func TestGetUserFromLDAPApiEndpoint_OrgNotfound(t *testing.T) {
Name: "John Doe",
Email: "john.doe@example.com",
Login: "johndoe",
Groups: []string{"cn=admins,ou=groups,dc=grafana,dc=org"},
OrgRoles: map[int64]models.RoleType{1: models.ROLE_ADMIN, 2: models.ROLE_VIEWER},
IsGrafanaAdmin: &isAdmin,
}
@@ -117,12 +118,12 @@ func TestGetUserFromLDAPApiEndpoint_OrgNotfound(t *testing.T) {
Groups: []*ldap.GroupToOrgRole{
{
GroupDN: "cn=admins,ou=groups,dc=grafana,dc=org",
OrgID: 1, OrgId: 1,
OrgRole: models.ROLE_ADMIN,
},
{
GroupDN: "cn=admins,ou=groups,dc=grafana2,dc=org",
OrgID: 2, OrgId: 2,
OrgRole: models.ROLE_VIEWER,
},
},
@@ -164,6 +165,7 @@ func TestGetUserFromLDAPApiEndpoint(t *testing.T) {
Name: "John Doe",
Email: "john.doe@example.com",
Login: "johndoe",
Groups: []string{"cn=admins,ou=groups,dc=grafana,dc=org", "another-group-not-matched"},
OrgRoles: map[int64]models.RoleType{1: models.ROLE_ADMIN},
IsGrafanaAdmin: &isAdmin,
}
@@ -178,7 +180,7 @@ func TestGetUserFromLDAPApiEndpoint(t *testing.T) {
Groups: []*ldap.GroupToOrgRole{
{
GroupDN: "cn=admins,ou=groups,dc=grafana,dc=org",
OrgID: 1, OrgId: 1,
OrgRole: models.ROLE_ADMIN,
},
},
@@ -203,7 +205,7 @@ func TestGetUserFromLDAPApiEndpoint(t *testing.T) {
sc := getUserFromLDAPContext(t, "/api/admin/ldap/johndoe")
require.Equal(t, sc.resp.Code, http.StatusOK) assert.Equal(t, sc.resp.Code, http.StatusOK)
expected := `
{
@@ -222,7 +224,8 @@ func TestGetUserFromLDAPApiEndpoint(t *testing.T) {
"isGrafanaAdmin": true,
"isDisabled": false,
"roles": [
{ "orgId": 1, "orgRole": "Admin", "orgName": "Main Org.", "groupDN": "cn=admins,ou=groups,dc=grafana,dc=org" } { "orgId": 1, "orgRole": "Admin", "orgName": "Main Org.", "groupDN": "cn=admins,ou=groups,dc=grafana,dc=org" },
{ "orgId": 0, "orgRole": "", "orgName": "", "groupDN": "another-group-not-matched" }
],
"teams": null
}
@@ -251,7 +254,7 @@ func TestGetUserFromLDAPApiEndpoint_WithTeamHandler(t *testing.T) {
Groups: []*ldap.GroupToOrgRole{
{
GroupDN: "cn=admins,ou=groups,dc=grafana,dc=org",
OrgID: 1, OrgId: 1,
OrgRole: models.ROLE_ADMIN,
},
},

View File

@@ -157,32 +157,34 @@ func latestSupportedVersion(plugin *m.Plugin) *m.Version {
// SelectVersion returns latest version if none is specified or the specified version. If the version string is not
// matched to existing version it errors out. It also errors out if version that is matched is not available for current
// os and platform. // os and platform. It expects plugin.Versions to be sorted so the newest version is first.
func SelectVersion(plugin *m.Plugin, version string) (*m.Version, error) {
var ver *m.Version var ver m.Version
if version == "" {
ver = &plugin.Versions[0]
}
for _, v := range plugin.Versions {
if v.Version == version {
ver = &v
}
}
if ver == nil {
return nil, xerrors.New("Could not find the version you're looking for")
}
latestForArch := latestSupportedVersion(plugin)
if latestForArch == nil {
return nil, xerrors.New("Plugin is not supported on your architecture and os.")
}
if latestForArch.Version == ver.Version { if version == "" {
return ver, nil return latestForArch, nil
}
return nil, xerrors.Errorf("Version you want is not supported on your architecture and os. Latest suitable version is %v", latestForArch.Version) for _, v := range plugin.Versions {
if v.Version == version {
ver = v
break
}
}
if len(ver.Version) == 0 {
return nil, xerrors.New("Could not find the version you're looking for")
}
if !supportsCurrentArch(&ver) {
return nil, xerrors.Errorf("Version you want is not supported on your architecture and os. Latest suitable version is %v", latestForArch.Version)
}
return &ver, nil
}
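The reworked Go function above checks architecture support first, then resolves the requested version. The control flow can be sketched in TypeScript with hypothetical types (the Go original also fixes a loop-variable aliasing bug by copying the value instead of taking `&v`):

```typescript
interface PluginVersion { version: string; archs?: string[]; }

// `versions` is assumed sorted newest-first, mirroring the Go code's expectation.
function selectVersion(versions: PluginVersion[], requested: string, currentArch: string): PluginVersion {
  const supportsArch = (v: PluginVersion) => !v.archs || v.archs.includes(currentArch);
  const latestForArch = versions.find(supportsArch);
  if (!latestForArch) {
    throw new Error('Plugin is not supported on your architecture and os.');
  }
  if (requested === '') {
    return latestForArch;
  }
  const ver = versions.find(v => v.version === requested);
  if (!ver) {
    throw new Error("Could not find the version you're looking for");
  }
  if (!supportsArch(ver)) {
    throw new Error(`Latest suitable version is ${latestForArch.version}`);
  }
  return ver;
}
```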
func RemoveGitBuildFromName(pluginName, filename string) string { func RemoveGitBuildFromName(pluginName, filename string) string {

View File

@@ -14,7 +14,7 @@ import (
"github.com/stretchr/testify/assert"
)
func TestFoldernameReplacement(t *testing.T) { func TestFolderNameReplacement(t *testing.T) {
Convey("path containing git commit path", t, func() {
pluginName := "datasource-plugin-kairosdb"
@@ -134,7 +134,68 @@ func TestIsPathSafe(t *testing.T) {
assert.False(t, isPathSafe("../../", dest))
assert.False(t, isPathSafe("../../test", dest))
})
}
func TestSelectVersion(t *testing.T) {
t.Run("Should return error when requested version does not exist", func(t *testing.T) {
_, err := SelectVersion(
makePluginWithVersions(versionArg{Version: "version"}),
"1.1.1",
)
assert.NotNil(t, err)
})
t.Run("Should return error when no version supports current arch", func(t *testing.T) {
_, err := SelectVersion(
makePluginWithVersions(versionArg{Version: "version", Arch: []string{"non-existent"}}),
"",
)
assert.NotNil(t, err)
})
t.Run("Should return error when requested version does not support current arch", func(t *testing.T) {
_, err := SelectVersion(
makePluginWithVersions(
versionArg{Version: "2.0.0"},
versionArg{Version: "1.1.1", Arch: []string{"non-existent"}},
),
"1.1.1",
)
assert.NotNil(t, err)
})
t.Run("Should return latest available for arch when no version specified", func(t *testing.T) {
ver, err := SelectVersion(
makePluginWithVersions(
versionArg{Version: "2.0.0", Arch: []string{"non-existent"}},
versionArg{Version: "1.0.0"},
),
"",
)
assert.Nil(t, err)
assert.Equal(t, "1.0.0", ver.Version)
})
t.Run("Should return latest version when no version specified", func(t *testing.T) {
ver, err := SelectVersion(
makePluginWithVersions(versionArg{Version: "2.0.0"}, versionArg{Version: "1.0.0"}),
"",
)
assert.Nil(t, err)
assert.Equal(t, "2.0.0", ver.Version)
})
t.Run("Should return requested version", func(t *testing.T) {
ver, err := SelectVersion(
makePluginWithVersions(
versionArg{Version: "2.0.0"},
versionArg{Version: "1.0.0"},
),
"1.0.0",
)
assert.Nil(t, err)
assert.Equal(t, "1.0.0", ver.Version)
})
}
func setupPluginInstallCmd(t *testing.T, pluginDir string) utils.CommandLine {
@@ -199,3 +260,35 @@ func skipWindows(t *testing.T) {
t.Skip("Skipping test on Windows")
}
}
type versionArg struct {
Version string
Arch []string
}
func makePluginWithVersions(versions ...versionArg) *models.Plugin {
plugin := &models.Plugin{
Id: "",
Category: "",
Versions: []models.Version{},
}
for _, version := range versions {
ver := models.Version{
Version: version.Version,
Commit: fmt.Sprintf("commit_%s", version.Version),
Url: fmt.Sprintf("url_%s", version.Version),
}
if version.Arch != nil {
ver.Arch = map[string]models.ArchMeta{}
for _, arch := range version.Arch {
ver.Arch[arch] = models.ArchMeta{
Md5: fmt.Sprintf("md5_%s", arch),
}
}
}
plugin.Versions = append(plugin.Versions, ver)
}
return plugin
}

View File

@@ -217,7 +217,7 @@ func (scanner *PluginScanner) loadPluginJson(pluginJsonFilePath string) error {
loader = reflect.New(reflect.TypeOf(pluginGoType)).Interface().(PluginLoader)
// External plugins need a module.js file for SystemJS to load
if !strings.HasPrefix(pluginJsonFilePath, setting.StaticRootPath) { if !strings.HasPrefix(pluginJsonFilePath, setting.StaticRootPath) && !scanner.IsBackendOnlyPlugin(pluginCommon.Type) {
module := filepath.Join(filepath.Dir(pluginJsonFilePath), "module.js")
if _, err := os.Stat(module); os.IsNotExist(err) {
plog.Warn("Plugin missing module.js",
@@ -231,6 +231,10 @@ func (scanner *PluginScanner) loadPluginJson(pluginJsonFilePath string) error {
return loader.Load(jsonParser, currentDir)
}
func (scanner *PluginScanner) IsBackendOnlyPlugin(pluginType string) bool {
return pluginType == "renderer"
}
func GetPluginMarkdown(pluginId string, name string) ([]byte, error) {
plug, exists := Plugins[pluginId]
if !exists {

View File

@@ -42,4 +42,18 @@ func TestPluginScans(t *testing.T) {
So(Apps["test-app"].Info.Screenshots[1].Path, ShouldEqual, "public/plugins/test-app/img/screenshot2.png")
})
Convey("When checking if renderer is backend only plugin", t, func() {
pluginScanner := &PluginScanner{}
result := pluginScanner.IsBackendOnlyPlugin("renderer")
So(result, ShouldEqual, true)
})
Convey("When checking if app is backend only plugin", t, func() {
pluginScanner := &PluginScanner{}
result := pluginScanner.IsBackendOnlyPlugin("app")
So(result, ShouldEqual, false)
})
} }


@@ -88,7 +88,13 @@ func (pn *PagerdutyNotifier) Notify(evalContext *alerting.EvalContext) error {
 	pn.log.Info("Notifying Pagerduty", "event_type", eventType)

 	payloadJSON := simplejson.New()
-	payloadJSON.Set("summary", evalContext.Rule.Name+" - "+evalContext.Rule.Message)
+
+	summary := evalContext.Rule.Name + " - " + evalContext.Rule.Message
+	if len(summary) > 1024 {
+		summary = summary[0:1024]
+	}
+	payloadJSON.Set("summary", summary)
+
 	if hostname, err := os.Hostname(); err == nil {
 		payloadJSON.Set("source", hostname)
 	}


@@ -408,12 +408,12 @@ func (server *Server) buildGrafanaUser(user *ldap.Entry) (*models.ExternalUserIn
 	for _, group := range server.Config.Groups {
 		// only use the first match for each org
-		if extUser.OrgRoles[group.OrgID] != "" {
+		if extUser.OrgRoles[group.OrgId] != "" {
 			continue
 		}

 		if isMemberOf(memberOf, group.GroupDN) {
-			extUser.OrgRoles[group.OrgID] = group.OrgRole
+			extUser.OrgRoles[group.OrgId] = group.OrgRole
 			if extUser.IsGrafanaAdmin == nil || !*extUser.IsGrafanaAdmin {
 				extUser.IsGrafanaAdmin = group.IsGrafanaAdmin
 			}


@@ -3,11 +3,10 @@ package ldap
 import (
 	"testing"

-	. "github.com/smartystreets/goconvey/convey"
-	"gopkg.in/ldap.v3"
-
 	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/models"
+	. "github.com/smartystreets/goconvey/convey"
+	"gopkg.in/ldap.v3"
 )

 func TestLDAPPrivateMethods(t *testing.T) {
@@ -124,7 +123,7 @@ func TestLDAPPrivateMethods(t *testing.T) {
 			Config: &ServerConfig{
 				Groups: []*GroupToOrgRole{
 					{
-						OrgID: 1,
+						OrgId: 1,
 					},
 				},
 			},
@@ -162,7 +161,7 @@ func TestLDAPPrivateMethods(t *testing.T) {
 			Config: &ServerConfig{
 				Groups: []*GroupToOrgRole{
 					{
-						OrgID: 1,
+						OrgId: 1,
 					},
 				},
 			},


@@ -55,7 +55,7 @@ type AttributeMap struct {
 // config "group_mappings" setting
 type GroupToOrgRole struct {
 	GroupDN string `toml:"group_dn"`
-	OrgID   int64  `toml:"org_id"`
+	OrgId   int64  `toml:"org_id"`

 	// This pointer specifies if setting was set (for backwards compatibility)
 	IsGrafanaAdmin *bool `toml:"grafana_admin"`
@@ -139,8 +139,8 @@ func readConfig(configFile string) (*Config, error) {
 	}

 	for _, groupMap := range server.Groups {
-		if groupMap.OrgID == 0 {
-			groupMap.OrgID = 1
+		if groupMap.OrgId == 0 {
+			groupMap.OrgId = 1
 		}
 	}
 }


@@ -3,10 +3,14 @@ package multildap
 import (
 	"errors"

+	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/models"
 	"github.com/grafana/grafana/pkg/services/ldap"
 )

+// logger to log
+var logger = log.New("ldap")
+
 // GetConfig gets LDAP config
 var GetConfig = ldap.GetConfig
@@ -119,12 +123,18 @@ func (multiples *MultiLDAP) Login(query *models.LoginUserQuery) (
 			return user, nil
 		}

-		// Continue if we couldn't find the user
-		if err == ErrCouldNotFindUser {
-			continue
-		}
-
 		if err != nil {
+			if isSilentError(err) {
+				logger.Debug(
+					"unable to login with LDAP - skipping server",
+					"host", config.Host,
+					"port", config.Port,
+					"error", err,
+				)
+				continue
+			}
+
 			return nil, err
 		}
 	}
@@ -204,3 +214,17 @@ func (multiples *MultiLDAP) Users(logins []string) (
 	return result, nil
 }
+
+// isSilentError evaluates an error and tells whenever we should fail the LDAP request
+// immediately or if we should continue into other LDAP servers
+func isSilentError(err error) bool {
+	continueErrs := []error{ErrInvalidCredentials, ErrCouldNotFindUser}
+
+	for _, cerr := range continueErrs {
+		if err == cerr {
+			return true
+		}
+	}
+
+	return false
+}


@@ -152,6 +152,25 @@ func TestMultiLDAP(t *testing.T) {
 			teardown()
 		})

+		Convey("Should still try to auth with the second server after receiving an invalid credentials error from the first", func() {
+			mock := setup()
+			mock.loginErrReturn = ErrInvalidCredentials
+
+			multi := New([]*ldap.ServerConfig{
+				{}, {},
+			})
+
+			_, err := multi.Login(&models.LoginUserQuery{})
+
+			So(mock.dialCalledTimes, ShouldEqual, 2)
+			So(mock.loginCalledTimes, ShouldEqual, 2)
+			So(mock.closeCalledTimes, ShouldEqual, 2)
+
+			So(err, ShouldEqual, ErrInvalidCredentials)
+
+			teardown()
+		})
+
 		Convey("Should return unknown error", func() {
 			mock := setup()


@@ -155,7 +155,13 @@ func (val *StringMapValue) Value() map[string]string {
 // slices and the actual interpolation is done on all simple string values in the structure. It returns a copy of any
 // map or slice value instead of modifying them in place.
 func tranformInterface(i interface{}) interface{} {
-	switch reflect.TypeOf(i).Kind() {
+	typeOf := reflect.TypeOf(i)
+
+	if typeOf == nil {
+		return nil
+	}
+
+	switch typeOf.Kind() {
 	case reflect.Slice:
 		return transformSlice(i.([]interface{}))
 	case reflect.Map:
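The guard above matters because `reflect.TypeOf` returns a nil `reflect.Type` when given a nil interface value, and calling `Kind()` on that nil Type panics. A minimal sketch of the fix (the `transform` function is a simplified stand-in for `tranformInterface`):

```go
package main

import (
	"fmt"
	"reflect"
)

// transform mirrors the nil guard above: reflect.TypeOf(nil) is a nil
// reflect.Type, and nil.Kind() would panic, so nil values are returned
// as-is before the type switch runs.
func transform(i interface{}) interface{} {
	typeOf := reflect.TypeOf(i)
	if typeOf == nil {
		return nil
	}
	switch typeOf.Kind() {
	case reflect.String:
		return "string:" + i.(string)
	default:
		return i
	}
}

func main() {
	// Without the guard, the nil case here would panic at runtime.
	fmt.Println(transform(nil), transform("x"), transform(42))
}
```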


@@ -131,6 +131,8 @@ func TestValues(t *testing.T) {
 				  - two
 				  - three:
 				      inside: $STRING
+				  - six:
+				      empty:
 				four:
 				  nested:
 				    onemore: $INT
@@ -146,11 +148,18 @@ func TestValues(t *testing.T) {
 				"one": 1,
 				"two": "test",
 				"three": []interface{}{
-					1, "two", anyMap{
+					1,
+					"two",
+					anyMap{
 						"three": anyMap{
 							"inside": "test",
 						},
 					},
+					anyMap{
+						"six": anyMap{
+							"empty": interface{}(nil),
+						},
+					},
 				},
 				"four": anyMap{
 					"nested": anyMap{
@@ -166,11 +175,18 @@ func TestValues(t *testing.T) {
 				"one": 1,
 				"two": "$STRING",
 				"three": []interface{}{
-					1, "two", anyMap{
+					1,
+					"two",
+					anyMap{
 						"three": anyMap{
 							"inside": "$STRING",
 						},
 					},
+					anyMap{
+						"six": anyMap{
+							"empty": interface{}(nil),
+						},
+					},
 				},
 				"four": anyMap{
 					"nested": anyMap{


@@ -96,22 +96,13 @@ func roleCounterSQL(role, alias string) string {
 	return `
 	(
 		SELECT COUNT(*)
-		FROM ` + dialect.Quote("user") + ` as u
-		WHERE
-			(SELECT COUNT(*)
-				FROM org_user
-				WHERE org_user.user_id=u.id
-				AND org_user.role='` + role + `')>0
+		FROM ` + dialect.Quote("user") + ` as u, org_user
+		WHERE ( org_user.user_id=u.id AND org_user.role='` + role + `' )
 	) as ` + alias + `,
 	(
 		SELECT COUNT(*)
-		FROM ` + dialect.Quote("user") + ` as u
-		WHERE
-			(SELECT COUNT(*)
-				FROM org_user
-				WHERE org_user.user_id=u.id
-				AND org_user.role='` + role + `')>0
-			AND u.last_seen_at>?
+		FROM ` + dialect.Quote("user") + ` as u, org_user
+		WHERE u.last_seen_at>? AND ( org_user.user_id=u.id AND org_user.role='` + role + `' )
 	) as active_` + alias
 }


@@ -60,11 +60,7 @@ func (e *AzureMonitorDatasource) executeTimeSeriesQuery(ctx context.Context, ori
 		if err != nil {
 			queryRes.Error = err
 		}
-		if val, ok := result.Results[query.RefID]; ok {
-			val.Series = append(result.Results[query.RefID].Series, queryRes.Series...)
-		} else {
-			result.Results[query.RefID] = queryRes
-		}
+		result.Results[query.RefID] = queryRes
 	}

 	return result, nil
@@ -88,22 +84,11 @@ func (e *AzureMonitorDatasource) buildQueries(queries []*tsdb.Query, timeRange *
 		azureMonitorTarget := query.Model.Get("azureMonitor").MustMap()
 		azlog.Debug("AzureMonitor", "target", azureMonitorTarget)

-		queryMode := fmt.Sprintf("%v", azureMonitorTarget["queryMode"])
-		if queryMode == "crossResource" {
-			return nil, fmt.Errorf("Alerting not supported for multiple resource queries")
-		}
-
-		var azureMonitorData map[string]interface{}
-		if queryMode == "singleResource" {
-			azureMonitorData = azureMonitorTarget["data"].(map[string]interface{})[queryMode].(map[string]interface{})
-		} else {
-			azureMonitorData = azureMonitorTarget
-		}
-
 		urlComponents := map[string]string{}
 		urlComponents["subscription"] = fmt.Sprintf("%v", query.Model.Get("subscription").MustString())
-		urlComponents["resourceGroup"] = fmt.Sprintf("%v", azureMonitorData["resourceGroup"])
-		urlComponents["metricDefinition"] = fmt.Sprintf("%v", azureMonitorData["metricDefinition"])
-		urlComponents["resourceName"] = fmt.Sprintf("%v", azureMonitorData["resourceName"])
+		urlComponents["resourceGroup"] = fmt.Sprintf("%v", azureMonitorTarget["resourceGroup"])
+		urlComponents["metricDefinition"] = fmt.Sprintf("%v", azureMonitorTarget["metricDefinition"])
+		urlComponents["resourceName"] = fmt.Sprintf("%v", azureMonitorTarget["resourceName"])

 		ub := urlBuilder{
 			DefaultSubscription: query.DataSource.JsonData.Get("subscriptionId").MustString(),
@@ -115,12 +100,12 @@ func (e *AzureMonitorDatasource) buildQueries(queries []*tsdb.Query, timeRange *
 		azureURL := ub.Build()

 		alias := ""
-		if val, ok := azureMonitorData["alias"]; ok {
+		if val, ok := azureMonitorTarget["alias"]; ok {
 			alias = fmt.Sprintf("%v", val)
 		}

-		timeGrain := fmt.Sprintf("%v", azureMonitorData["timeGrain"])
-		timeGrains := azureMonitorData["allowedTimeGrainsMs"]
+		timeGrain := fmt.Sprintf("%v", azureMonitorTarget["timeGrain"])
+		timeGrains := azureMonitorTarget["allowedTimeGrainsMs"]
 		if timeGrain == "auto" {
 			timeGrain, err = e.setAutoTimeGrain(query.IntervalMs, timeGrains)
 			if err != nil {
@@ -132,16 +117,13 @@ func (e *AzureMonitorDatasource) buildQueries(queries []*tsdb.Query, timeRange *
 		params.Add("api-version", "2018-01-01")
 		params.Add("timespan", fmt.Sprintf("%v/%v", startTime.UTC().Format(time.RFC3339), endTime.UTC().Format(time.RFC3339)))
 		params.Add("interval", timeGrain)
-		params.Add("aggregation", fmt.Sprintf("%v", azureMonitorData["aggregation"]))
-		params.Add("metricnames", fmt.Sprintf("%v", azureMonitorData["metricName"]))
-
-		if val, ok := azureMonitorData["metricNamespace"]; ok {
-			params.Add("metricnamespace", fmt.Sprintf("%v", val))
-		}
-
-		dimension := strings.TrimSpace(fmt.Sprintf("%v", azureMonitorData["dimension"]))
-		dimensionFilter := strings.TrimSpace(fmt.Sprintf("%v", azureMonitorData["dimensionFilter"]))
-		if azureMonitorData["dimension"] != nil && azureMonitorData["dimensionFilter"] != nil && len(dimension) > 0 && len(dimensionFilter) > 0 && dimension != "None" {
+		params.Add("aggregation", fmt.Sprintf("%v", azureMonitorTarget["aggregation"]))
+		params.Add("metricnames", fmt.Sprintf("%v", azureMonitorTarget["metricName"]))
+		params.Add("metricnamespace", fmt.Sprintf("%v", azureMonitorTarget["metricNamespace"]))
+
+		dimension := strings.TrimSpace(fmt.Sprintf("%v", azureMonitorTarget["dimension"]))
+		dimensionFilter := strings.TrimSpace(fmt.Sprintf("%v", azureMonitorTarget["dimensionFilter"]))
+		if azureMonitorTarget["dimension"] != nil && azureMonitorTarget["dimensionFilter"] != nil && len(dimension) > 0 && len(dimensionFilter) > 0 && dimension != "None" {
 			params.Add("$filter", fmt.Sprintf("%s eq '%s'", dimension, dimensionFilter))
 		}


@@ -36,20 +36,15 @@ func TestAzureMonitorDatasource(t *testing.T) {
 					Model: simplejson.NewFromAny(map[string]interface{}{
 						"subscription": "12345678-aaaa-bbbb-cccc-123456789abc",
 						"azureMonitor": map[string]interface{}{
-							"queryMode": "singleResource",
-							"data": map[string]interface{}{
-								"singleResource": map[string]interface{}{
-									"timeGrain":        "PT1M",
-									"aggregation":      "Average",
-									"resourceGroup":    "grafanastaging",
-									"resourceName":     "grafana",
-									"metricDefinition": "Microsoft.Compute/virtualMachines",
-									"metricNamespace":  "Microsoft.Compute-virtualMachines",
-									"metricName":       "Percentage CPU",
-									"alias":            "testalias",
-									"queryType":        "Azure Monitor",
-								},
-							},
+							"timeGrain":        "PT1M",
+							"aggregation":      "Average",
+							"resourceGroup":    "grafanastaging",
+							"resourceName":     "grafana",
+							"metricDefinition": "Microsoft.Compute/virtualMachines",
+							"metricNamespace":  "Microsoft.Compute-virtualMachines",
+							"metricName":       "Percentage CPU",
+							"alias":            "testalias",
+							"queryType":        "Azure Monitor",
 						},
 					}),
 					RefId: "A",


@@ -155,7 +155,7 @@ func init() {
 		"AWS/Events":   {"RuleName"},
 		"AWS/FSx":      {},
 		"AWS/Firehose": {"DeliveryStreamName"},
-		"AWS/GameLift": {"FleetId", "InstanceType", "MatchmakingConfigurationName", "MatchmakingConfigurationName-RuleName", "MetricGroup", "OperatingSystem", "QueueName"},
+		"AWS/GameLift": {"FleetId", "InstanceType", "MatchmakingConfigurationName", "MatchmakingConfigurationName-RuleName", "MetricGroups", "OperatingSystem", "QueueName"},
 		"AWS/Glue":     {"JobName", "JobRunId", "Type"},
 		"AWS/Inspector": {},
 		"AWS/IoT":      {"ActionType", "BehaviorName", "CheckName", "JobId", "Protocol", "RuleName", "ScheduledAuditName", "SecurityProfileName"},
@@ -179,7 +179,7 @@ func init() {
 		"AWS/OpsWorks": {"InstanceId", "LayerId", "StackId"},
 		"AWS/Polly":    {"Operation"},
 		"AWS/RDS":      {"DBClusterIdentifier", "DBInstanceIdentifier", "DatabaseClass", "DbClusterIdentifier", "EngineName", "Role", "SourceRegion"},
-		"AWS/Redshift": {"ClusterIdentifier", "NodeID", "Service class", "Stage", "latency", "wmlid"},
+		"AWS/Redshift": {"ClusterIdentifier", "NodeID", "Service class", "Stage", "latency", "wlmid"},
 		"AWS/Route53":  {"HealthCheckId", "Region"},
 		"AWS/S3":       {"BucketName", "FilterId", "StorageType"},
 		"AWS/SES":      {},


@@ -3,7 +3,6 @@ package mssql
 import (
 	"database/sql"
 	"fmt"
-	"net/url"
 	"strconv"

 	"github.com/grafana/grafana/pkg/setting"
@@ -24,7 +23,10 @@ func init() {
 func newMssqlQueryEndpoint(datasource *models.DataSource) (tsdb.TsdbQueryEndpoint, error) {
 	logger := log.New("tsdb.mssql")

-	cnnstr := generateConnectionString(datasource)
+	cnnstr, err := generateConnectionString(datasource)
+	if err != nil {
+		return nil, err
+	}
+
 	if setting.Env == setting.DEV {
 		logger.Debug("getEngine", "connection", cnnstr)
 	}
@@ -36,35 +38,35 @@ func newMssqlQueryEndpoint(datasource *models.DataSource) (tsdb.TsdbQueryEndpoin
 		MetricColumnTypes: []string{"VARCHAR", "CHAR", "NVARCHAR", "NCHAR"},
 	}

-	rowTransformer := mssqlRowTransformer{
+	queryResultTransformer := mssqlQueryResultTransformer{
 		log: logger,
 	}

-	return sqleng.NewSqlQueryEndpoint(&config, &rowTransformer, newMssqlMacroEngine(), logger)
+	return sqleng.NewSqlQueryEndpoint(&config, &queryResultTransformer, newMssqlMacroEngine(), logger)
 }

-func generateConnectionString(datasource *models.DataSource) string {
+func generateConnectionString(datasource *models.DataSource) (string, error) {
 	server, port := util.SplitHostPortDefault(datasource.Url, "localhost", "1433")

 	encrypt := datasource.JsonData.Get("encrypt").MustString("false")
-
-	query := url.Values{}
-	query.Add("database", datasource.Database)
-	query.Add("encrypt", encrypt)
-
-	u := &url.URL{
-		Scheme:   "sqlserver",
-		User:     url.UserPassword(datasource.User, datasource.DecryptedPassword()),
-		Host:     fmt.Sprintf("%s:%s", server, port),
-		RawQuery: query.Encode(),
-	}
-
-	return u.String()
+	connStr := fmt.Sprintf("server=%s;port=%s;database=%s;user id=%s;password=%s;",
+		server,
+		port,
+		datasource.Database,
+		datasource.User,
+		datasource.DecryptedPassword(),
+	)
+	if encrypt != "false" {
+		connStr += fmt.Sprintf("encrypt=%s;", encrypt)
+	}
+	return connStr, nil
 }

-type mssqlRowTransformer struct {
+type mssqlQueryResultTransformer struct {
 	log log.Logger
 }

-func (t *mssqlRowTransformer) Transform(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error) {
+func (t *mssqlQueryResultTransformer) TransformQueryResult(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error) {
 	values := make([]interface{}, len(columnTypes))
 	valuePtrs := make([]interface{}, len(columnTypes))
@@ -98,3 +100,7 @@ func (t *mssqlRowTransformer) Transform(columnTypes []*sql.ColumnType, rows *cor
 	return values, nil
 }
+
+func (t *mssqlQueryResultTransformer) TransformQueryError(err error) error {
+	return err
+}
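The hunk above reverts the `sqlserver://` URL form back to a plain `key=value;` connection string. A standalone sketch of the reverted builder (parameter names are illustrative):

```go
package main

import "fmt"

// buildConnStr mirrors the reverted format above: a plain "key=value;"
// string rather than a sqlserver:// URL, with encrypt appended only when
// it is not "false".
func buildConnStr(server, port, database, user, password, encrypt string) string {
	connStr := fmt.Sprintf("server=%s;port=%s;database=%s;user id=%s;password=%s;",
		server, port, database, user, password)
	if encrypt != "false" {
		connStr += fmt.Sprintf("encrypt=%s;", encrypt)
	}
	return connStr
}

func main() {
	fmt.Println(buildConnStr("localhost", "1433", "db", "user", "pw", "false"))
}
```

Unlike the URL form, this format has no way to escape a `;` inside a password, which is why the removed test case with password `pass;word` no longer applies.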


@@ -28,33 +28,6 @@ import (
 // If needed, change the variable below to the IP address of the database.
 var serverIP = "localhost"

-func TestGenerateConnectionString(t *testing.T) {
-	encrypted, _ := simplejson.NewJson([]byte(`{"encrypt":"false"}`))
-
-	testSet := []struct {
-		ds       *models.DataSource
-		expected string
-	}{
-		{
-			&models.DataSource{
-				User:     "user",
-				Database: "db",
-				Url:      "localhost:1433",
-				SecureJsonData: securejsondata.GetEncryptedJsonData(map[string]string{
-					"password": "pass;word",
-				}),
-				JsonData: encrypted,
-			},
-			"sqlserver://user:pass;word@localhost:1433?database=db&encrypt=false",
-		},
-	}
-
-	for i := range testSet {
-		got := generateConnectionString(testSet[i].ds)
-		if got != testSet[i].expected {
-			t.Errorf("mssql connString error for testCase %d got: %s expected: %s", i, got, testSet[i].expected)
-		}
-	}
-}
-
 func TestMSSQL(t *testing.T) {
 	SkipConvey("MSSQL", t, func() {
 		x := InitMSSQLTestDB(t)


@@ -1,11 +1,13 @@
 package mysql

 import (
+	"errors"
 	"fmt"
 	"regexp"
 	"strings"

 	"github.com/grafana/grafana/pkg/components/gtime"
+	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/tsdb"
 	"github.com/grafana/grafana/pkg/tsdb/sqleng"
 )
@@ -13,19 +15,29 @@ import (
 const rsIdentifier = `([_a-zA-Z0-9]+)`
 const sExpr = `\$` + rsIdentifier + `\(([^\)]*)\)`

+var restrictedRegExp = regexp.MustCompile(`(?im)([\s]*show[\s]+grants|[\s,]session_user\([^\)]*\)|[\s,]current_user(\([^\)]*\))?|[\s,]system_user\([^\)]*\)|[\s,]user\([^\)]*\))([\s,;]|$)`)
+
 type mySqlMacroEngine struct {
 	*sqleng.SqlMacroEngineBase
 	timeRange *tsdb.TimeRange
 	query     *tsdb.Query
+	logger    log.Logger
 }

-func newMysqlMacroEngine() sqleng.SqlMacroEngine {
-	return &mySqlMacroEngine{SqlMacroEngineBase: sqleng.NewSqlMacroEngineBase()}
+func newMysqlMacroEngine(logger log.Logger) sqleng.SqlMacroEngine {
+	return &mySqlMacroEngine{SqlMacroEngineBase: sqleng.NewSqlMacroEngineBase(), logger: logger}
 }

 func (m *mySqlMacroEngine) Interpolate(query *tsdb.Query, timeRange *tsdb.TimeRange, sql string) (string, error) {
 	m.timeRange = timeRange
 	m.query = query

+	matches := restrictedRegExp.FindAllStringSubmatch(sql, 1)
+	if len(matches) > 0 {
+		m.logger.Error("show grants, session_user(), current_user(), system_user() or user() not allowed in query")
+		return "", errors.New("Invalid query. Inspect Grafana server log for details")
+	}
+
 	rExp, _ := regexp.Compile(sExpr)
 	var macroError error
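The added `restrictedRegExp` rejects queries that try to read the database's user identity or grants. The pattern can be exercised on its own (the `blocked` helper is illustrative, not part of the patch):

```go
package main

import (
	"fmt"
	"regexp"
)

// Same pattern as in the diff above: case-insensitive (?i), multi-line (?m)
// match for SHOW GRANTS and the user-identity functions, bounded on the
// right by whitespace, comma, semicolon, or end of input.
var restrictedRegExp = regexp.MustCompile(`(?im)([\s]*show[\s]+grants|[\s,]session_user\([^\)]*\)|[\s,]current_user(\([^\)]*\))?|[\s,]system_user\([^\)]*\)|[\s,]user\([^\)]*\))([\s,;]|$)`)

// blocked reports whether the macro engine would reject the query.
func blocked(sql string) bool {
	return len(restrictedRegExp.FindAllStringSubmatch(sql, 1)) > 0
}

func main() {
	fmt.Println(blocked("show grants;"), blocked("SELECT user()"), blocked("SELECT 1"))
}
```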


@@ -6,13 +6,16 @@ import (
 	"testing"
 	"time"

+	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/tsdb"
 	. "github.com/smartystreets/goconvey/convey"
 )

 func TestMacroEngine(t *testing.T) {
 	Convey("MacroEngine", t, func() {
-		engine := &mySqlMacroEngine{}
+		engine := &mySqlMacroEngine{
+			logger: log.New("test"),
+		}
 		query := &tsdb.Query{}

 		Convey("Given a time range between 2018-04-12 00:00 and 2018-04-12 00:05", func() {
@@ -157,5 +160,33 @@ func TestMacroEngine(t *testing.T) {
 				So(sql, ShouldEqual, fmt.Sprintf("select time >= %d AND time <= %d", from.Unix(), to.Unix()))
 			})
 		})
+
+		Convey("Given queries that contains unallowed user functions", func() {
+			tcs := []string{
+				"select \nSESSION_USER(), abc",
+				"SELECT session_User( ) ",
+				"SELECT session_User( )\n",
+				"SELECT current_user",
+				"SELECT current_USER",
+				"SELECT current_user()",
+				"SELECT Current_User()",
+				"SELECT current_user( )",
+				"SELECT current_user(\t )",
+				"SELECT user()",
+				"SELECT USER()",
+				"SELECT SYSTEM_USER()",
+				"SELECT System_User()",
+				"SELECT System_User( )",
+				"SELECT System_User(\t \t)",
+				"SHOW \t grants",
+				" show Grants\n",
+				"show grants;",
+			}
+
+			for _, tc := range tcs {
+				_, err := engine.Interpolate(nil, nil, tc)
+				So(err.Error(), ShouldEqual, "Invalid query. Inspect Grafana server log for details")
+			}
+		})
 	})
 }


@@ -2,11 +2,14 @@ package mysql
 import (
 	"database/sql"
+	"errors"
 	"fmt"
 	"reflect"
 	"strconv"
 	"strings"

+	"github.com/VividCortex/mysqlerr"
 	"github.com/grafana/grafana/pkg/setting"

 	"github.com/go-sql-driver/mysql"
@@ -59,18 +62,18 @@ func newMysqlQueryEndpoint(datasource *models.DataSource) (tsdb.TsdbQueryEndpoin
 		MetricColumnTypes: []string{"CHAR", "VARCHAR", "TINYTEXT", "TEXT", "MEDIUMTEXT", "LONGTEXT"},
 	}

-	rowTransformer := mysqlRowTransformer{
+	rowTransformer := mysqlQueryResultTransformer{
 		log: logger,
 	}

-	return sqleng.NewSqlQueryEndpoint(&config, &rowTransformer, newMysqlMacroEngine(), logger)
+	return sqleng.NewSqlQueryEndpoint(&config, &rowTransformer, newMysqlMacroEngine(logger), logger)
 }

-type mysqlRowTransformer struct {
+type mysqlQueryResultTransformer struct {
 	log log.Logger
 }

-func (t *mysqlRowTransformer) Transform(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error) {
+func (t *mysqlQueryResultTransformer) TransformQueryResult(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error) {
 	values := make([]interface{}, len(columnTypes))

 	for i := range values {
@@ -128,3 +131,16 @@ func (t *mysqlRowTransformer) Transform(columnTypes []*sql.ColumnType, rows *cor
 	return values, nil
 }
+
+func (t *mysqlQueryResultTransformer) TransformQueryError(err error) error {
+	if driverErr, ok := err.(*mysql.MySQLError); ok {
+		if driverErr.Number != mysqlerr.ER_PARSE_ERROR && driverErr.Number != mysqlerr.ER_BAD_FIELD_ERROR && driverErr.Number != mysqlerr.ER_NO_SUCH_TABLE {
+			t.log.Error("query error", "err", err)
+			return errQueryFailed
+		}
+	}
+
+	return err
+}
+
+var errQueryFailed = errors.New("Query failed. Please inspect Grafana server log for details")


@@ -33,13 +33,13 @@ func newPostgresQueryEndpoint(datasource *models.DataSource) (tsdb.TsdbQueryEndp
 		MetricColumnTypes: []string{"UNKNOWN", "TEXT", "VARCHAR", "CHAR"},
 	}

-	rowTransformer := postgresRowTransformer{
+	queryResultTransformer := postgresQueryResultTransformer{
 		log: logger,
 	}

 	timescaledb := datasource.JsonData.Get("timescaledb").MustBool(false)

-	return sqleng.NewSqlQueryEndpoint(&config, &rowTransformer, newPostgresMacroEngine(timescaledb), logger)
+	return sqleng.NewSqlQueryEndpoint(&config, &queryResultTransformer, newPostgresMacroEngine(timescaledb), logger)
 }

 func generateConnectionString(datasource *models.DataSource) string {
@@ -54,11 +54,11 @@ func generateConnectionString(datasource *models.DataSource) string {
 	return u.String()
 }

-type postgresRowTransformer struct {
+type postgresQueryResultTransformer struct {
 	log log.Logger
 }

-func (t *postgresRowTransformer) Transform(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error) {
+func (t *postgresQueryResultTransformer) TransformQueryResult(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error) {
 	values := make([]interface{}, len(columnTypes))
 	valuePtrs := make([]interface{}, len(columnTypes))
@@ -93,3 +93,7 @@ func (t *postgresRowTransformer) Transform(columnTypes []*sql.ColumnType, rows *
 	return values, nil
 }
+
+func (t *postgresQueryResultTransformer) TransformQueryError(err error) error {
+	return err
+}


@@ -12,6 +12,8 @@ import (
 	"sync"
 	"time"

+	"github.com/grafana/grafana/pkg/setting"
+
 	"github.com/grafana/grafana/pkg/infra/log"
 	"github.com/grafana/grafana/pkg/tsdb"
@@ -29,9 +31,12 @@ type SqlMacroEngine interface {
 	Interpolate(query *tsdb.Query, timeRange *tsdb.TimeRange, sql string) (string, error)
 }

-// SqlTableRowTransformer transforms a query result row to RowValues with proper types.
-type SqlTableRowTransformer interface {
-	Transform(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error)
+// SqlQueryResultTransformer transforms a query result row to RowValues with proper types.
+type SqlQueryResultTransformer interface {
+	// TransformQueryResult transforms a query result row to RowValues with proper types.
+	TransformQueryResult(columnTypes []*sql.ColumnType, rows *core.Rows) (tsdb.RowValues, error)
+	// TransformQueryError transforms a query error.
+	TransformQueryError(err error) error
 }

 type engineCacheType struct {
@@ -52,12 +57,12 @@ var NewXormEngine = func(driverName string, connectionString string) (*xorm.Engi
 }

 type sqlQueryEndpoint struct {
-	macroEngine       SqlMacroEngine
-	rowTransformer    SqlTableRowTransformer
-	engine            *xorm.Engine
-	timeColumnNames   []string
-	metricColumnTypes []string
-	log               log.Logger
+	macroEngine            SqlMacroEngine
+	queryResultTransformer SqlQueryResultTransformer
+	engine                 *xorm.Engine
+	timeColumnNames        []string
+	metricColumnTypes      []string
+	log                    log.Logger
 }

 type SqlQueryEndpointConfiguration struct {
@@ -68,12 +73,12 @@ type SqlQueryEndpointConfiguration struct {
 	MetricColumnTypes []string
 }

-var NewSqlQueryEndpoint = func(config *SqlQueryEndpointConfiguration, rowTransformer SqlTableRowTransformer, macroEngine SqlMacroEngine, log log.Logger) (tsdb.TsdbQueryEndpoint, error) {
+var NewSqlQueryEndpoint = func(config *SqlQueryEndpointConfiguration, queryResultTransformer SqlQueryResultTransformer, macroEngine SqlMacroEngine, log log.Logger) (tsdb.TsdbQueryEndpoint, error) {
 	queryEndpoint := sqlQueryEndpoint{
-		rowTransformer:  rowTransformer,
-		macroEngine:     macroEngine,
-		timeColumnNames: []string{"time"},
-		log:             log,
+		queryResultTransformer: queryResultTransformer,
+		macroEngine:            macroEngine,
+		timeColumnNames:        []string{"time"},
+		log:                    log,
 	}

 	if len(config.TimeColumnNames) > 0 {
@@ -158,7 +163,7 @@ func (e *sqlQueryEndpoint) Query(ctx context.Context, dsInfo *models.DataSource,
 		rows, err := db.Query(rawSQL)
 		if err != nil {
-			queryResult.Error = err
+			queryResult.Error = e.queryResultTransformer.TransformQueryError(err)
 			return
 		}
@@ -240,7 +245,7 @@ func (e *sqlQueryEndpoint) transformToTable(query *tsdb.Query, rows *core.Rows,
 			return fmt.Errorf("query row limit exceeded, limit %d", rowLimit)
 		}

-		values, err := e.rowTransformer.Transform(columnTypes, rows)
+		values, err := e.queryResultTransformer.TransformQueryResult(columnTypes, rows)
 		if err != nil {
 			return err
 		}
@@ -338,7 +343,7 @@ func (e *sqlQueryEndpoint) transformToTimeSeries(query *tsdb.Query, rows *core.R
 			return fmt.Errorf("query row limit exceeded, limit %d", rowLimit)
 		}

-		values, err := e.rowTransformer.Transform(columnTypes, rows)
+		values, err := e.queryResultTransformer.TransformQueryResult(columnTypes, rows)
 		if err != nil {
 			return err
 		}
@@ -418,7 +423,9 @@ func (e *sqlQueryEndpoint) transformToTimeSeries(query *tsdb.Query, rows *core.R
 			series.Points = append(series.Points, tsdb.TimePoint{value, null.FloatFrom(timestamp)})

-			e.log.Debug("Rows", "metric", metric, "time", timestamp, "value", value)
+			if setting.Env == setting.DEV {
+				e.log.Debug("Rows", "metric", metric, "time", timestamp, "value", value)
+			}
 		}
 	}


@@ -41,24 +41,7 @@ export const LoginPage: FC = () => {
               />
             ) : null}

-            {isOauthEnabled ? (
-              <>
-                <div className="text-center login-divider">
-                  <div>
-                    <div className="login-divider-line" />
-                  </div>
-                  <div>
-                    <span className="login-divider-text">{disableLoginForm ? null : <span>or</span>}</span>
-                  </div>
-                  <div>
-                    <div className="login-divider-line" />
-                  </div>
-                </div>
-                <div className="clearfix" />
-
-                <LoginServiceButtons />
-              </>
-            ) : null}
+            <LoginServiceButtons />

             {!disableUserSignUp ? <UserSignup /> : null}
           </div>
           <CSSTransition


@@ -46,11 +46,39 @@ export interface LoginServices {
   [key: string]: LoginService;
 }
+const LoginDivider = () => {
+  return (
+    <>
+      <div className="text-center login-divider">
+        <div>
+          <div className="login-divider-line" />
+        </div>
+        <div>
+          <span className="login-divider-text">{config.disableLoginForm ? null : <span>or</span>}</span>
+        </div>
+        <div>
+          <div className="login-divider-line" />
+        </div>
+      </div>
+      <div className="clearfix" />
+    </>
+  );
+};
 export const LoginServiceButtons = () => {
   const keyNames = Object.keys(loginServices());
-  const serviceElements = keyNames.map(key => {
+  const serviceElementsEnabled = keyNames.filter(key => {
     const service: LoginService = loginServices()[key];
-    return service.enabled ? (
+    return service.enabled;
+  });
+  if (serviceElementsEnabled.length === 0) {
+    return null;
+  }
+  const serviceElements = serviceElementsEnabled.map(key => {
+    const service: LoginService = loginServices()[key];
+    return (
       <a
         key={key}
         className={`btn btn-medium btn-service btn-service--${service.className || key} login-btn`}
@@ -60,8 +88,14 @@ export const LoginServiceButtons = () => {
         <i className={`btn-service-icon fa fa-${service.icon ? service.icon : key}`} />
         Sign in with {service.name}
       </a>
-    ) : null;
+    );
   });
-  return <div className="login-oauth text-center">{serviceElements}</div>;
+  const divider = LoginDivider();
+  return (
+    <>
+      {divider}
+      <div className="login-oauth text-center">{serviceElements}</div>
+    </>
+  );
 };
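The hunk above replaces map-with-nulls by filter-then-map: when no login service is enabled, the component (and its divider) is not rendered at all. A minimal sketch of that pattern, with hypothetical types standing in for Grafana's `LoginService`:

```typescript
// Hypothetical simplified types; not Grafana's actual definitions.
interface LoginService {
  enabled: boolean;
  name: string;
}

// Filter enabled services first, then render; return null when nothing
// is enabled so the caller can skip the whole button group and divider.
function renderServiceNames(services: Record<string, LoginService>): string[] | null {
  const enabledKeys = Object.keys(services).filter(key => services[key].enabled);
  if (enabledKeys.length === 0) {
    return null;
  }
  return enabledKeys.map(key => `Sign in with ${services[key].name}`);
}
```

Compared to mapping every key and emitting `null` entries, this makes the "no services at all" case explicit and cheap to test.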


@@ -1,9 +1,9 @@
-import $ from 'jquery';
 import _ from 'lodash';
 import coreModule from 'app/core/core_module';
 import appEvents from 'app/core/app_events';
 import { getExploreUrl } from 'app/core/utils/explore';
+import locationUtil from 'app/core/utils/location_util';
 import { store } from 'app/store/store';
 import Mousetrap from 'mousetrap';
@@ -47,9 +47,36 @@ export class KeybindingSrv {
       this.bind('s o', this.openSearch);
       this.bind('f', this.openSearch);
       this.bind('esc', this.exit);
+      this.bindGlobal('esc', this.globalEsc);
     }
   }
+  globalEsc() {
+    const anyDoc = document as any;
+    const activeElement = anyDoc.activeElement;
+
+    // first, an open typeahead needs to handle Escape itself
+    const typeaheads = document.querySelectorAll('.slate-typeahead--open');
+    if (typeaheads.length > 0) {
+      return;
+    }
+
+    // second, check if we are in an input we can blur
+    if (activeElement && activeElement.blur) {
+      if (
+        activeElement.nodeName === 'INPUT' ||
+        activeElement.nodeName === 'TEXTAREA' ||
+        activeElement.hasAttribute('data-slate-editor')
+      ) {
+        anyDoc.activeElement.blur();
+        return;
+      }
+    }
+
+    // no focused input or editor should block this, so exit
+    this.exit();
+  }
   openSearch() {
     appEvents.emit('show-dash-search');
   }
@@ -71,11 +98,6 @@ export class KeybindingSrv {
   }
   exit() {
-    const popups = $('.popover.in, .slate-typeahead');
-    if (popups.length > 0) {
-      return;
-    }
     appEvents.emit('hide-modal');
     if (this.modalOpen) {
@@ -199,8 +221,10 @@ export class KeybindingSrv {
       const panel = dashboard.getPanelById(dashboard.meta.focusPanelId);
       const datasource = await this.datasourceSrv.get(panel.datasource);
       const url = await getExploreUrl(panel, panel.targets, datasource, this.datasourceSrv, this.timeSrv);
-      if (url) {
-        this.$timeout(() => this.$location.url(url));
+      const urlWithoutBase = locationUtil.stripBaseFromUrl(url);
+      if (urlWithoutBase) {
+        this.$timeout(() => this.$location.url(urlWithoutBase));
       }
     }
   });
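The new `globalEsc` handler decides between three outcomes: let an open typeahead consume Escape, blur a focused input/editor, or fall through to the global exit. The branching can be reduced to a pure function for testing; `FocusInfo` below is a hypothetical stand-in for `document.activeElement`, not a Grafana type:

```typescript
// Hypothetical summary of document.activeElement's relevant properties.
interface FocusInfo {
  nodeName: string;
  isSlateEditor: boolean;
}

// Returns true when Escape is consumed locally (typeahead open, or a
// focused input/textarea/Slate editor that should merely be blurred),
// and false when the global exit (close modal / leave search) may run.
function escIsConsumedLocally(active: FocusInfo | null, openTypeaheads: number): boolean {
  if (openTypeaheads > 0) {
    return true; // the typeahead handles Escape itself
  }
  if (!active) {
    return false;
  }
  return active.nodeName === 'INPUT' || active.nodeName === 'TEXTAREA' || active.isSlateEditor;
}
```

Pulling the decision out of the DOM-touching handler keeps the keybinding code thin and the priority order (typeahead > blur > exit) explicit.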


@@ -91,7 +91,8 @@ export async function getExploreUrl(
     const exploreState = JSON.stringify({ ...state, originPanelId: panel.id });
     url = renderUrl('/explore', { left: exploreState });
   }
-  return url;
+  const finalUrl = config.appSubUrl + url;
+  return finalUrl;
 }
 export function buildQueryTransaction(
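Together with the keybinding change, this forms a round trip: `getExploreUrl` now prepends `config.appSubUrl` (so the URL works when Grafana is served under a sub-path), and the caller strips the base again before handing the path to Angular's `$location`, which expects app-relative URLs. A simplified sketch of the two helpers (the real `locationUtil.stripBaseFromUrl` takes only the URL and reads the sub-URL from config):

```typescript
// Simplified stand-ins for illustration; signatures are hypothetical.
function addBase(appSubUrl: string, url: string): string {
  // Mirrors `config.appSubUrl + url` in getExploreUrl.
  return appSubUrl + url;
}

function stripBaseFromUrl(appSubUrl: string, url: string): string {
  // Remove the sub-path prefix so $location.url() gets an app-relative path.
  return url.startsWith(appSubUrl) ? url.slice(appSubUrl.length) : url;
}
```

With `appSubUrl = '/grafana'`, `/explore` becomes `/grafana/explore` for the browser and is stripped back to `/explore` for the router.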


@@ -89,12 +89,12 @@ export class LdapPage extends PureComponent<Props, State> {
         {config.buildInfo.isEnterprise && ldapSyncInfo && <LdapSyncInfo ldapSyncInfo={ldapSyncInfo} />}
-        <h3 className="page-heading">User mapping</h3>
+        <h3 className="page-heading">Test user mapping</h3>
         <div className="gf-form-group">
           <form onSubmit={this.search} className="gf-form-inline">
-            <FormField label="User name" labelWidth={8} inputWidth={30} type="text" id="username" name="username" />
+            <FormField label="Username" labelWidth={8} inputWidth={30} type="text" id="username" name="username" />
             <button type="submit" className="btn btn-primary">
-              Test LDAP mapping
+              Run
             </button>
           </form>
         </div>


@@ -9,7 +9,6 @@ interface Props {
 export const LdapUserGroups: FC<Props> = ({ groups, showAttributeMapping }) => {
   const items = showAttributeMapping ? groups : groups.filter(item => item.orgRole);
-  const roleColumnClass = showAttributeMapping && 'width-14';
   return (
     <div className="gf-form-group">
@@ -17,32 +16,39 @@ export const LdapUserGroups: FC<Props> = ({ groups, showAttributeMapping }) => {
       <table className="filter-table form-inline">
         <thead>
           <tr>
+            {showAttributeMapping && <th>LDAP Group</th>}
             <th>Organisation</th>
             <th>Role</th>
-            {showAttributeMapping && <th colSpan={2}>LDAP Group</th>}
           </tr>
         </thead>
         <tbody>
           {items.map((group, index) => {
             return (
               <tr key={`${group.orgId}-${index}`}>
-                <td className="width-16">{group.orgName}</td>
-                <td className={roleColumnClass}>{group.orgRole}</td>
                 {showAttributeMapping && (
                   <>
                     <td>{group.groupDN}</td>
-                    <td>
-                      {!group.orgRole && (
-                        <span className="text-warning pull-right">
-                          No match
-                          <Tooltip placement="top" content="No matching groups found" theme={'info'}>
-                            <div className="gf-form-help-icon gf-form-help-icon--right-normal">
-                              <i className="fa fa-info-circle" />
-                            </div>
-                          </Tooltip>
-                        </span>
-                      )}
-                    </td>
+                    {!group.orgRole && (
+                      <>
+                        <td />
+                        <td>
+                          <span className="text-warning">
+                            No match
+                            <Tooltip placement="top" content="No matching groups found" theme={'info'}>
+                              <span className="gf-form-help-icon">
+                                <i className="fa fa-info-circle" />
+                              </span>
+                            </Tooltip>
+                          </span>
+                        </td>
+                      </>
+                    )}
                   </>
                 )}
+                {group.orgName && (
+                  <>
+                    <td>{group.orgName}</td>
+                    <td>{group.orgRole}</td>
+                  </>
+                )}
               </tr>


@@ -18,8 +18,21 @@ export const LdapUserInfo: FC<Props> = ({ ldapUser, showAttributeMapping }) => {
       {ldapUser.roles && ldapUser.roles.length > 0 && (
         <LdapUserGroups groups={ldapUser.roles} showAttributeMapping={showAttributeMapping} />
       )}
-      {ldapUser.teams && ldapUser.teams.length > 0 && (
+      {ldapUser.teams && ldapUser.teams.length > 0 ? (
         <LdapUserTeams teams={ldapUser.teams} showAttributeMapping={showAttributeMapping} />
+      ) : (
+        <div className="gf-form-group">
+          <div className="gf-form">
+            <table className="filter-table form-inline">
+              <tbody>
+                <tr>
+                  <td>No teams found via LDAP</td>
+                </tr>
+              </tbody>
+            </table>
+          </div>
+        </div>
       )}
     </>
   );


@@ -1,5 +1,4 @@
 import React, { FC } from 'react';
-import { css } from 'emotion';
 import { Tooltip } from '@grafana/ui';
 import { LdapTeam } from 'app/types';
@@ -10,10 +9,6 @@ interface Props {
 export const LdapUserTeams: FC<Props> = ({ teams, showAttributeMapping }) => {
   const items = showAttributeMapping ? teams : teams.filter(item => item.teamName);
-  const teamColumnClass = showAttributeMapping && 'width-14';
-  const noMatchPlaceholderStyle = css`
-    display: flex;
-  `;
   return (
     <div className="gf-form-group">
@@ -21,29 +16,41 @@ export const LdapUserTeams: FC<Props> = ({ teams, showAttributeMapping }) => {
       <table className="filter-table form-inline">
         <thead>
           <tr>
+            {showAttributeMapping && <th>LDAP Group</th>}
             <th>Organisation</th>
             <th>Team</th>
-            {showAttributeMapping && <th>LDAP</th>}
           </tr>
         </thead>
         <tbody>
           {items.map((team, index) => {
             return (
               <tr key={`${team.teamName}-${index}`}>
-                <td className="width-16">
-                  {team.orgName || (
-                    <div className={`text-warning ${noMatchPlaceholderStyle}`}>
-                      No match
-                      <Tooltip placement="top" content="No matching teams found" theme={'info'}>
-                        <div className="gf-form-help-icon gf-form-help-icon--right-normal">
-                          <i className="fa fa-info-circle" />
-                        </div>
-                      </Tooltip>
-                    </div>
-                  )}
-                </td>
-                <td className={teamColumnClass}>{team.teamName}</td>
-                {showAttributeMapping && <td>{team.groupDN}</td>}
+                {showAttributeMapping && (
+                  <>
+                    <td>{team.groupDN}</td>
+                    {!team.orgName && (
+                      <>
+                        <td />
+                        <td>
+                          <div className="text-warning">
+                            No match
+                            <Tooltip placement="top" content="No matching teams found" theme={'info'}>
+                              <span className="gf-form-help-icon">
+                                <i className="fa fa-info-circle" />
+                              </span>
+                            </Tooltip>
+                          </div>
+                        </td>
+                      </>
+                    )}
+                  </>
+                )}
+                {team.orgName && (
+                  <>
+                    <td>{team.orgName}</td>
+                    <td>{team.teamName}</td>
+                  </>
+                )}
               </tr>
             );
           })}


@@ -30,27 +30,27 @@
<tbody> <tbody>
<tr ng-repeat="user in ctrl.users"> <tr ng-repeat="user in ctrl.users">
<td class="width-4 text-center link-td"> <td class="width-4 text-center link-td">
<a href="admin/users/{{user.authLabel === 'LDAP' ? 'ldap/' : ''}}edit/{{user.id}}"> <a href="admin/users/edit/{{user.id}}">
<img class="filter-table__avatar" ng-src="{{user.avatarUrl}}"></img> <img class="filter-table__avatar" ng-src="{{user.avatarUrl}}"></img>
</a> </a>
</td> </td>
<td class="link-td"> <td class="link-td">
<a href="admin/users/{{user.authLabel === 'LDAP' ? 'ldap/' : ''}}edit/{{user.id}}"> <a href="admin/users/edit/{{user.id}}">
{{user.login}} {{user.login}}
</a> </a>
</td> </td>
<td class="link-td"> <td class="link-td">
<a href="admin/users/{{user.authLabel === 'LDAP' ? 'ldap/' : ''}}edit/{{user.id}}"> <a href="admin/users/edit/{{user.id}}">
{{user.email}} {{user.email}}
</a> </a>
</td> </td>
<td class="link-td"> <td class="link-td">
<a href="admin/users/{{user.authLabel === 'LDAP' ? 'ldap/' : ''}}edit/{{user.id}}"> <a href="admin/users/edit/{{user.id}}">
{{user.lastSeenAtAge}} {{user.lastSeenAtAge}}
</a> </a>
</td> </td>
<td class="link-td"> <td class="link-td">
<a href="admin/users/{{user.authLabel === 'LDAP' ? 'ldap/' : ''}}edit/{{user.id}}"> <a href="admin/users/edit/{{user.id}}">
<i class="fa fa-shield" ng-show="user.isAdmin" bs-tooltip="'Grafana Admin'"></i> <i class="fa fa-shield" ng-show="user.isAdmin" bs-tooltip="'Grafana Admin'"></i>
</a> </a>
</td> </td>


@@ -38,6 +38,7 @@ export function loadLdapState(): ThunkResult<void> {
       const connectionInfo = await getLdapState();
       dispatch(ldapConnectionInfoLoadedAction(connectionInfo));
     } catch (error) {
+      error.isHandled = true;
       const ldapError = {
         title: error.data.message,
         body: error.data.error,
@@ -63,6 +64,7 @@ export function loadUserMapping(username: string): ThunkResult<void> {
       const userInfo = await getUserInfo(username);
       dispatch(userMappingInfoLoadedAction(userInfo));
     } catch (error) {
+      error.isHandled = true;
       const userError = {
         title: error.data.message,
         body: error.data.error,
@@ -106,6 +108,7 @@ export function loadLdapUserInfo(userId: number): ThunkResult<void> {
       dispatch(loadUserSessions(userId));
       dispatch(loadUserMapping(user.login));
     } catch (error) {
+      error.isHandled = true;
      const userError = {
         title: error.data.message,
         body: error.data.error,
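Each catch block above sets `error.isHandled = true` before building its own error UI, so that a global error interceptor can skip showing a duplicate notification. A minimal sketch of the pattern (`HandledError` and both helpers are hypothetical illustrations, not Grafana APIs):

```typescript
// Hypothetical error type carrying an "already handled locally" marker.
interface HandledError extends Error {
  isHandled?: boolean;
}

// Local code marks the error after rendering its own message.
function markHandled(error: HandledError): HandledError {
  error.isHandled = true;
  return error;
}

// A global interceptor would consult the flag and only notify for
// errors no local handler has claimed.
function shouldNotifyGlobally(error: HandledError): boolean {
  return !error.isHandled;
}
```

The flag rides on the error object itself, so no extra plumbing is needed between the local catch block and the global handler.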


@@ -47,18 +47,14 @@ export const syncLdapUser = async (userId: number) => {
 };
 export const getUserInfo = async (username: string): Promise<LdapUser> => {
-  try {
-    const response = await getBackendSrv().get(`/api/admin/ldap/${username}`);
-    const { name, surname, email, login, isGrafanaAdmin, isDisabled, roles, teams } = response;
-    return {
-      info: { name, surname, email, login },
-      permissions: { isGrafanaAdmin, isDisabled },
-      roles,
-      teams,
-    };
-  } catch (error) {
-    throw error;
-  }
+  const response = await getBackendSrv().get(`/api/admin/ldap/${username}`);
+  const { name, surname, email, login, isGrafanaAdmin, isDisabled, roles, teams } = response;
+  return {
+    info: { name, surname, email, login },
+    permissions: { isGrafanaAdmin, isDisabled },
+    roles,
+    teams,
+  };
 };
 export const getUser = async (id: number): Promise<User> => {
export const getUser = async (id: number): Promise<User> => { export const getUser = async (id: number): Promise<User> => {


@@ -1,34 +1,46 @@
-// Libraries
 import React, { PureComponent } from 'react';
+import { hot } from 'react-hot-loader';
+import { connect } from 'react-redux';
+import { css } from 'emotion';
+import { Button } from '@grafana/ui';
-// Services & Utils
-import { AngularComponent, getAngularLoader } from '@grafana/runtime';
+import { AngularComponent, getAngularLoader, getDataSourceSrv } from '@grafana/runtime';
 import appEvents from 'app/core/app_events';
+import { getAlertingValidationMessage } from './getAlertingValidationMessage';
-// Components
 import { EditorTabBody, EditorToolbarView } from '../dashboard/panel_editor/EditorTabBody';
 import EmptyListCTA from 'app/core/components/EmptyListCTA/EmptyListCTA';
 import StateHistory from './StateHistory';
 import 'app/features/alerting/AlertTabCtrl';
-// Types
 import { DashboardModel } from '../dashboard/state/DashboardModel';
 import { PanelModel } from '../dashboard/state/PanelModel';
 import { TestRuleResult } from './TestRuleResult';
 import { AlertBox } from 'app/core/components/AlertBox/AlertBox';
-import { AppNotificationSeverity } from 'app/types';
+import { AppNotificationSeverity, StoreState } from 'app/types';
+import { PanelEditorTabIds, getPanelEditorTab } from '../dashboard/panel_editor/state/reducers';
+import { changePanelEditorTab } from '../dashboard/panel_editor/state/actions';
 interface Props {
   angularPanel?: AngularComponent;
   dashboard: DashboardModel;
   panel: PanelModel;
+  changePanelEditorTab: typeof changePanelEditorTab;
 }
-export class AlertTab extends PureComponent<Props> {
+interface State {
+  validatonMessage: string;
+}
+
+class UnConnectedAlertTab extends PureComponent<Props, State> {
   element: any;
   component: AngularComponent;
   panelCtrl: any;
+  state: State = {
+    validatonMessage: '',
+  };
   componentDidMount() {
     if (this.shouldLoadAlertTab()) {
       this.loadAlertTab();
@@ -51,8 +63,8 @@ export class AlertTab extends PureComponent<Props> {
     }
   }
-  loadAlertTab() {
-    const { angularPanel } = this.props;
+  async loadAlertTab() {
+    const { angularPanel, panel } = this.props;
     const scope = angularPanel.getScope();
@@ -71,6 +83,17 @@ export class AlertTab extends PureComponent<Props> {
     const scopeProps = { ctrl: this.panelCtrl };
     this.component = loader.load(this.element, scopeProps, template);
+
+    const validatonMessage = await getAlertingValidationMessage(
+      panel.transformations,
+      panel.targets,
+      getDataSourceSrv(),
+      panel.datasource
+    );
+
+    if (validatonMessage) {
+      this.setState({ validatonMessage });
+    }
   }
   stateHistory = (): EditorToolbarView => {
@@ -128,19 +151,39 @@ export class AlertTab extends PureComponent<Props> {
     this.forceUpdate();
   };
+  switchToQueryTab = () => {
+    const { changePanelEditorTab } = this.props;
+    changePanelEditorTab(getPanelEditorTab(PanelEditorTabIds.Queries));
+  };
+
+  renderValidationMessage = () => {
+    const { validatonMessage } = this.state;
+
+    return (
+      <div
+        className={css`
+          width: 508px;
+          margin: 128px auto;
+        `}
+      >
+        <h2>{validatonMessage}</h2>
+        <br />
+        <div className="gf-form-group">
+          <Button size={'md'} variant={'secondary'} icon="fa fa-arrow-left" onClick={this.switchToQueryTab}>
+            Go back to Queries
+          </Button>
+        </div>
+      </div>
+    );
+  };
+
   render() {
     const { alert, transformations } = this.props.panel;
-    const hasTransformations = transformations && transformations.length;
+    const { validatonMessage } = this.state;
+    const hasTransformations = transformations && transformations.length > 0;
-    if (!alert && hasTransformations) {
-      return (
-        <EditorTabBody heading="Alert">
-          <AlertBox
-            severity={AppNotificationSeverity.Warning}
-            title="Transformations are not supported in alert queries"
-          />
-        </EditorTabBody>
-      );
+    if (!alert && validatonMessage) {
+      return this.renderValidationMessage();
     }
     const toolbarItems = alert ? [this.stateHistory(), this.testRule(), this.deleteAlert()] : [];
@@ -163,9 +206,20 @@ export class AlertTab extends PureComponent<Props> {
         )}
         <div ref={element => (this.element = element)} />
-        {!alert && <EmptyListCTA {...model} />}
+        {!alert && !validatonMessage && <EmptyListCTA {...model} />}
       </>
     </EditorTabBody>
   );
   }
 }
+
+export const mapStateToProps = (state: StoreState) => ({});
+
+const mapDispatchToProps = { changePanelEditorTab };
+
+export const AlertTab = hot(module)(
+  connect(
+    mapStateToProps,
+    mapDispatchToProps
+  )(UnConnectedAlertTab)
+);

Some files were not shown because too many files have changed in this diff.