Compare commits

...

64 Commits

Author SHA1 Message Date
Hugo Häggmark
21bf8b71bc Chore: Fixes test based on master branch 2020-02-06 12:53:10 +03:00
Alexander Zobnin
dcb8beecb1 release 6.6.1 2020-02-06 12:53:10 +03:00
Leonard Gram
64568a1938 Quota: Makes sure we provide the request context to the quota service (#21949)
It was missing for ldap_login, which meant that the first signup failed
for users with LDAP + quota enabled. There are also potential cases where we
can't provide a request context (background jobs); these are also covered,
but need a refactoring.

(cherry picked from commit 59530e4758)
2020-02-06 12:53:10 +03:00
Sofia Papagiannaki
9b3241a629 Annotations: Change indices and rewrites annotation find query to improve database query performance (#21915)
Drops old indices, creates new ones, and rewrites the annotation find query
to address performance issues when querying the annotation table with
a large number of rows.

Fixes #21902

Co-authored-by: Marcus Efraimsson <marcus.efraimsson@gmail.com>
Co-authored-by: Kyle Brandt <kyle@kbrandt.com>
(cherry picked from commit 5ae95190ed)
2020-02-06 12:53:10 +03:00
Hugo Häggmark
4bc6bf5e54 Prometheus: Fixes default step value for annotation query (#21934)
Fixes #21914

(cherry picked from commit 26d71c90f5)
2020-02-06 12:53:10 +03:00
Dominik Prokop
828ba74674 Dashboard edit: Fix 404 when making dashboard editable
(cherry picked from commit 90d415861d)
2020-02-06 12:53:10 +03:00
Mark Carey
568bbf4ff7 Metrics: Adds back missing summary quantiles (#21858)
Adds back missing summary quantiles that were mistakenly
removed in v6.6.0.

Fixes #21857

(cherry picked from commit 28230bbf52)
2020-02-06 12:53:10 +03:00
Ivana Huckova
17fc5251e1 grafana/ui: Fix displaying of bars in React Graph (#21922)
(cherry picked from commit 88226672f1)
2020-02-06 12:53:10 +03:00
Erik Sundell
b1d3fec9a8 Fix formatting (#21894)
(cherry picked from commit 78b1ab8360)
2020-02-06 12:53:10 +03:00
Edgar Orendain
12d3576666 Graph Panel: Fixed typo in thresholds form (#21903)
(cherry picked from commit bb8e15ceab)
2020-02-06 12:53:10 +03:00
Tobias Skarhed
476f9b6224 Disable logging in button (#21900)
(cherry picked from commit 959c49f6d8)
2020-02-06 12:53:10 +03:00
Jorge Luis Betancourt
29c6fa4114 Datasource: Show access (Browser/Server) select on the Prometheus datasource (#21833)
* Datasource: Show access (Browser/Server) select on the Prometheus datasource configuration editor

* Trigger build

(cherry picked from commit 96099636dc)
2020-02-06 12:53:10 +03:00
Shavonn Brown
248f73a00f deps so can mock in tests (#21827)
(cherry picked from commit c4e3110034)
2020-02-06 12:53:10 +03:00
Peter Holmberg
4fa2e9b90a Fix: Reimplement HideFromTabs in Tabs component (#21863)
* reimplement hidefromtabs

* remove console log

* going with option b instead

* less explicit

(cherry picked from commit 93195facba)
2020-02-06 12:53:10 +03:00
Marcus Efraimsson
04c2e41733 Image Rendering: Fix render of graph panel legend aligned to the right using Grafana image renderer plugin/service (#21854)
Don't render the body--phantomjs class on the body element when
the PhantomJS renderer is not in use.

Fixes #21830

(cherry picked from commit 6e80315531)
2020-02-06 12:53:10 +03:00
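
A minimal TypeScript sketch of the idea behind the fix above, assuming the phantomJSRenderer boot-config flag added in the grafana/runtime diff further down; the exact wiring in Grafana may differ:

// Illustrative only: add the PhantomJS-specific body class only when the
// PhantomJS renderer is actually in use, so the image renderer plugin/service
// gets an unmodified body element.
interface BootConfig {
  phantomJSRenderer: boolean;
}

export function applyRendererBodyClass(config: BootConfig, body: HTMLElement = document.body): void {
  // classList.toggle with a force argument adds or removes the class in one call.
  body.classList.toggle('body--phantomjs', config.phantomJSRenderer);
}
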
Dominik Prokop
4c21a1e016 grafana/toolkit: Fix failing linter when there were lint issues (#21849)
(cherry picked from commit f8654a3a2f)
2020-02-06 12:53:10 +03:00
Torkel Ödegaard
604a603e82 DatasourceSettings: Fixed issue navigating away from data source settings page (#21841)
(cherry picked from commit b7faa9023e)
2020-02-06 12:53:10 +03:00
Dominik Prokop
9dd964f503 AppPageCtrl: Fix digest issue with app page initialisation (#21847)
(cherry picked from commit 050d902ed1)
2020-02-06 12:53:10 +03:00
Dominik Prokop
338c2b738e Fix digest issue with query part editor's actions menu (#21834)
(cherry picked from commit 1ef91e3fc4)
2020-02-06 12:53:10 +03:00
Torkel Ödegaard
5bc6a3505d Graphite: Fixed issue where functions with multiple required params and no defaults produced params that could not be edited (groupByNodes, groupByTags) (#21814)
* Graphite: Fixed issue with functions with multiple required params and no defaults

* removed some prev changes

* Update public/app/plugins/datasource/graphite/func_editor.ts

Co-authored-by: Dominik Prokop <dominik.prokop@grafana.com>
(cherry picked from commit 0fd088c757)
2020-02-06 12:53:10 +03:00
Evgeny Bibko
3b5efdbc84 TimePicker: Should display in kiosk mode (#21816)
* Timepicker class fixed

* Missed arrow in dashboard title

(cherry picked from commit 7638156666)
2020-02-06 12:53:10 +03:00
Torkel Ödegaard
a3cea78f40 StatPanels: Fixed migration from old singlestat and default min & max being copied even when gauge was disabled (#21820)
(cherry picked from commit 13948c0b76)
2020-02-06 12:53:10 +03:00
Marcus Andersson
36f02aaef7 Fixed strict errors (#21823)
(cherry picked from commit ffe0a1f975)
2020-02-06 12:53:10 +03:00
Marcus Andersson
2aefb73876 Fix: prevents the BarGauge from exploding when the datasource returns an empty result. (#21791)
* Fixed issue where the gauge threw an error on an empty result.

* Some refactorings to improve the code.

* Added some tests to make sure this doesn't happen again.

(cherry picked from commit cab082438e)
2020-02-06 12:53:10 +03:00
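
A minimal sketch of the guard this commit describes, using simplified stand-ins for the display types in @grafana/data (compare the BarGauge diff further down):

// Simplified stand-ins for the @grafana/data display types.
interface DisplayValue {
  numeric: number;
  text: string;
  color?: string;
}
type DisplayProcessor = (value: unknown) => DisplayValue;

// Returns the cell color, or null when no display processor is available,
// e.g. because the datasource returned an empty result.
function getCellColor(display: DisplayProcessor | undefined, positionValue: number | null): string | null {
  const color = display ? display(positionValue).color : null;
  return color ?? null;
}
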
Shavonn Brown
94c374d187 Azure Monitor: Fix Application Insights API key field to allow input (#21738)
* Fix update api key input

* update snapshot

(cherry picked from commit 0fa20cb231)
2020-02-06 12:53:10 +03:00
Andrej Ocenas
6eb60b943a Influxdb: Fix cascader when doing log query in explore (#21787)
* Fix cascader options and add tests

* Add comment

* Fix typo

(cherry picked from commit 85dad73e9d)
2020-02-06 12:53:10 +03:00
Leonard Gram
8c14a6e070 MSI: License for Enterprise (#21794)
(cherry picked from commit 20e96a9241)
2020-02-06 12:53:10 +03:00
Shavonn Brown
a069b5d639 Make importDataSourcePlugin cancelable (#21430)
* make importDataSourcePlugin cancelable

* fix imported plugin assignment

* init datasource plugin to redux

* remove commented

* testDataSource to redux

* add err console log

* isTesting is never used

* tests, loadError type

* more tests, testingStatus obj

(cherry picked from commit b3d5e678f4)
2020-02-06 12:53:10 +03:00
Marcus Efraimsson
49255fbb6a OpenTSDB: Add back missing ngInject (#21796)
Adds back missing ngInject on datasource constructor
to make it work again.

Fixes #21770

(cherry picked from commit b75412d6ae)
2020-02-06 12:53:10 +03:00
Emil Tullstedt
0c843ae8d2 Config: add meta feature toggle (#21786)
(cherry picked from commit e95bcc4ba2)
2020-02-06 12:53:10 +03:00
Ivana Huckova
52a5645c85 Logs panel: Rename labels to unique labels (#21783)
(cherry picked from commit b3bcbcccce)
2020-02-06 12:53:10 +03:00
Ryan McKinley
9ad66b7fed grafana/data: Add type for secure json in DataSourceAPI (#21772)
(cherry picked from commit 67c5531961)
2020-02-06 12:53:10 +03:00
kay delaney
af10ba3f1f Explore/Loki: Fix handling of legacy log row context request (#21767)
Closes #21695

(cherry picked from commit 7569a8608a)
2020-02-06 12:53:10 +03:00
Leonard Gram
5c11bbdfb4 release 6.6.0 2020-01-27 13:32:03 +01:00
Emil Tullstedt
872bc2d973 Footer: Display Grafana edition (#21717)
Co-authored-by: Torkel Ödegaard <torkel@grafana.com>
(cherry picked from commit 3fabbbff4d)
2020-01-27 13:32:03 +01:00
Andrej Ocenas
cbace87b56 Explore: Fix context view in logs, where some rows may have been filtered out. (#21729)
* Fix timestamp formats and use uid to filter context rows

* Remove timestamps from tests

(cherry picked from commit 0fda3c4f44)
2020-01-27 13:32:03 +01:00
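
A sketch of the filtering idea behind the fix above, with a simplified row shape; the real change lives in the LogRowContextProvider diff further down and prefers the row uid over the timestamp so rows sharing a timestamp are not lost:

// Simplified context row; Loki results carry an id field, other sources may not.
interface ContextRow {
  id?: string;
  timeEpochMs: number;
  line: string;
}

// Drop the focal row itself from the fetched context rows.
function withoutFocalRow(rows: ContextRow[], focalUid: string, focalTimeEpochMs: number): ContextRow[] {
  return rows.filter(row =>
    row.id !== undefined ? row.id !== focalUid : row.timeEpochMs !== focalTimeEpochMs
  );
}
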
Ryan McKinley
f59b9b6545 Toolkit: add canvas-mock to test setup (#21739)
(cherry picked from commit ed140346a7)
2020-01-27 13:32:03 +01:00
Tobias Skarhed
3ac81e50d7 TablePanel: Sanitize column link (#21735)
(cherry picked from commit 751eb2c8bb)
2020-01-27 13:32:03 +01:00
Tobias Skarhed
0378c66dcd Template vars: Add error message for failed query var (#21731)
(cherry picked from commit 4c41d7e7fb)
2020-01-27 13:32:03 +01:00
Torkel Ödegaard
79de911d0a Devenv: Fixed devenv dashboard template var datasource (#21715)
(cherry picked from commit b28eac2626)
2020-01-27 13:32:03 +01:00
Torkel Ödegaard
eecd09d1c8 Footer: added back missing footer to login page (#21720)
(cherry picked from commit 198f561541)
2020-01-27 13:32:03 +01:00
Marcus Efraimsson
14ae363aaa Admin: Viewer should not see link to teams in side menu (#21716)
Fixes so that viewers don't see a link to Teams in the side menu when
the editors_can_admin setting is enabled.

(cherry picked from commit 63a912629d)
2020-01-27 13:32:03 +01:00
Dominik Prokop
18a92cc540 Annotations: Fix issue with annotation query editors (#21712)
(cherry picked from commit d9e1cb44c8)
2020-01-27 13:32:03 +01:00
Dominik Prokop
cfb8912200 grafana/ui: Remove path import from grafana-data (#21707)
(cherry picked from commit 5e87af8b2a)
2020-01-27 13:32:03 +01:00
Ivana Huckova
47c57a1b9d Loki: Fix Loki with repeated panels and interpolation for Explore (#21685)
(cherry picked from commit e75840737e)
2020-01-27 13:32:03 +01:00
Torkel Ödegaard
ce3f43c6d0 StatPanels: Fixed possible migration issue (#21681)
(cherry picked from commit 8266959681)
2020-01-27 13:32:03 +01:00
Dominik Prokop
6717d43921 PhantomJS: Fix rendering of panels using Prometheus datasource
In 043bb59 a usage of URLSearchParams was introduced, which is not supported by PhantomJS. @babel/polyfill (deprecated) does not contain a polyfill for URLSearchParams, so the code (and Prometheus graph rendering) was failing in the PhantomJS environment.

The solution is to add https://www.npmjs.com/package/url-search-params-polyfill, which takes care of the URLSearchParams polyfilling.

(cherry picked from commit cdfac32dfd)
2020-01-27 13:32:03 +01:00
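
The fix is essentially a dependency plus a side-effect import; a sketch of how such a polyfill is typically wired up (the entry-point placement is illustrative):

// Side-effect import: registers a global URLSearchParams implementation
// where the runtime (here PhantomJS) lacks one.
import 'url-search-params-polyfill';

// After the import, the same code works under PhantomJS and modern browsers.
const params = new URLSearchParams();
params.set('query', 'up');
params.set('start', '1580825186');
console.log(params.toString()); // query=up&start=1580825186
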
Torkel Ödegaard
a951bab782 StatPanel: minor height tweak (#21663)
(cherry picked from commit a734cd3640)
2020-01-27 13:32:03 +01:00
Erik Sundell
841e140f5b Run query when region, namespace and metric changes (#21633)
(cherry picked from commit 296af36a6f)
2020-01-27 13:32:03 +01:00
kay delaney
bbd2014e9d Explore: Fixes some LogDetailsRow markup (#21671)
- Moves filter titles to icons rather than table cell
- Increases colspan of ad-hoc stats cell instead of
rendering empty cells for parsed fields

(cherry picked from commit a115729c55)
2020-01-27 13:32:03 +01:00
Sofia Papagiannaki
8ce48b98dc SQLStore: Fix PostgreSQL failure to create organisation for first time (#21648)
* Fix PostgreSQL failure to create organisation for first time

Co-authored-by: Arve Knudsen <arve.knudsen@gmail.com>
(cherry picked from commit 2283ceec09)
2020-01-27 13:32:03 +01:00
Erik Sundell
60419f7e72 CloudWatch: Auto period snap to next higher period (#21659)
* Snap to next higher period instead of closest

* Adjust period calc

(cherry picked from commit 685c9043a8)
2020-01-27 13:32:03 +01:00
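
A sketch of the "snap to next higher" idea from the commit above in isolation; the period list and fallback here are illustrative, not CloudWatch's exact rules:

// Candidate periods in seconds (illustrative values).
const candidatePeriods = [60, 300, 900, 3600, 21600, 86400];

// Snapping to the closest period could round down and request too many data
// points; snapping to the next higher period keeps the request within limits.
function snapToNextHigherPeriod(requestedSeconds: number): number {
  return candidatePeriods.find(p => p >= requestedSeconds) ?? candidatePeriods[candidatePeriods.length - 1];
}

snapToNextHigherPeriod(400); // 900 rather than 300
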
Torkel Ödegaard
38e4db88d1 Login: Better auto sizing of login logo (#21645)
(cherry picked from commit 741e1bb7e9)
2020-01-27 13:32:03 +01:00
Torkel Ödegaard
d619c529f0 Alert: Minor tweak to work with license warnings (#21654)
(cherry picked from commit c228cde2b6)
2020-01-27 13:32:03 +01:00
Ryan McKinley
a8643d89be Toolkit: copyIfNonExistent order swapped (#21653)
(cherry picked from commit aee07949a3)
2020-01-27 13:32:03 +01:00
Ivana Huckova
a7c52c7dc8 Explore: Fix log level color and add tests (#21646)
(cherry picked from commit 6feb4a3221)
2020-01-27 13:32:03 +01:00
Torkel Ödegaard
7ad14532a3 Templating: A way to support object syntax for global vars (#21634)
(cherry picked from commit 92ef8644c5)
2020-01-27 13:32:03 +01:00
kenju
23f977f000 CloudWatch: Add DynamoDB Accelerator (DAX) metrics & dimensions (#21644)
Closes #10494

(cherry picked from commit 935d447c6a)
2020-01-27 13:32:03 +01:00
Emil Hessman
c172fe8915 Plugins: Apply adhoc filter in Elasticsearch logs query (#21346)
Fixes #21086

(cherry picked from commit 25e2f1c2dd)
2020-01-27 13:32:03 +01:00
Ryan McKinley
ddeee1820d TestData: allow negative values for random_walk parameters (#21627)
(cherry picked from commit 5f14d62c0d)
2020-01-27 13:32:03 +01:00
Sofia Papagiannaki
f28fd41c3b Update musl checksums (#21621)
(cherry picked from commit 2021a2df74)
2020-01-27 13:32:03 +01:00
Erik Sundell
57fb967fec CloudWatch: Expand dimension value in alias correctly (#21626)
* Make sure dimension value is being returned, and not just label

* Fix typo

(cherry picked from commit a1733bb412)
2020-01-27 13:32:03 +01:00
Leonard Gram
9046263122 Build: adds missing filters required to build oss msi (#21618)
(cherry picked from commit 7e0890d57d)
2020-01-20 14:39:44 +01:00
Leonard Gram
2306826cff release 6.6.0-beta1 2020-01-20 13:34:39 +01:00
130 changed files with 2863 additions and 570 deletions

View File

@@ -1211,6 +1211,7 @@ workflows:
- shellcheck - shellcheck
- mysql-integration-test - mysql-integration-test
- postgres-integration-test - postgres-integration-test
filters: *filter-only-master
- build-ee-msi: - build-ee-msi:
requires: requires:
- build-all-enterprise - build-all-enterprise
@@ -1323,6 +1324,7 @@ workflows:
- shellcheck - shellcheck
- mysql-integration-test - mysql-integration-test
- postgres-integration-test - postgres-integration-test
filters: *filter-only-release
- build-ee-msi: - build-ee-msi:
requires: requires:
- build-all-enterprise - build-all-enterprise

View File

@@ -222,7 +222,7 @@
"text": "A", "text": "A",
"value": ["A"] "value": ["A"]
}, },
"datasource": "TestData DB-1", "datasource": "gdev-testdata",
"definition": "*", "definition": "*",
"hide": 0, "hide": 0,
"includeAll": true, "includeAll": true,
@@ -247,7 +247,7 @@
"text": "AA", "text": "AA",
"value": ["AA"] "value": ["AA"]
}, },
"datasource": "TestData DB-1", "datasource": "gdev-testdata",
"definition": "$datacenter.*", "definition": "$datacenter.*",
"hide": 0, "hide": 0,
"includeAll": true, "includeAll": true,

View File

@@ -0,0 +1,71 @@
import { sleep, check, group } from 'k6';
import { createClient, createBasicAuthClient } from './modules/client.js';
import { createTestOrgIfNotExists, createTestdataDatasourceIfNotExists } from './modules/util.js';
export let options = {
noCookiesReset: true
};
let endpoint = __ENV.URL || 'http://localhost:3000';
const client = createClient(endpoint);
export const setup = () => {
const basicAuthClient = createBasicAuthClient(endpoint, 'admin', 'admin');
const orgId = createTestOrgIfNotExists(basicAuthClient);
const datasourceId = createTestdataDatasourceIfNotExists(basicAuthClient);
client.withOrgId(orgId);
return {
orgId: orgId,
datasourceId: datasourceId,
};
}
export default (data) => {
group("annotation by tag test", () => {
if (__ITER === 0) {
group("user authenticates thru ui with username and password", () => {
let res = client.ui.login('admin', 'admin');
check(res, {
'response status is 200': (r) => r.status === 200,
'response has cookie \'grafana_session\' with 32 characters': (r) => r.cookies.grafana_session[0].value.length === 32,
});
});
}
if (__ITER !== 0) {
group("batch tsdb requests with annotations by tag", () => {
const batchCount = 20;
const requests = [];
const payload = {
from: '1547765247624',
to: '1547768847624',
queries: [{
refId: 'A',
scenarioId: 'random_walk',
intervalMs: 10000,
maxDataPoints: 433,
datasourceId: data.datasourceId,
}]
};
requests.push({ method: 'GET', url: '/api/annotations?from=1580825186534&to=1580846786535' });
for (let n = 0; n < batchCount; n++) {
requests.push({ method: 'POST', url: '/api/tsdb/query', body: payload });
}
let responses = client.batch(requests);
for (let n = 0; n < batchCount; n++) {
check(responses[n], {
'response status is 200': (r) => r.status === 200,
});
}
});
}
});
sleep(5)
}
export const teardown = (data) => {}

View File

@@ -2,5 +2,5 @@
"npmClient": "yarn", "npmClient": "yarn",
"useWorkspaces": true, "useWorkspaces": true,
"packages": ["packages/*"], "packages": ["packages/*"],
"version": "6.6.0-pre" "version": "6.6.1"
} }

View File

@@ -3,7 +3,7 @@
"license": "Apache-2.0", "license": "Apache-2.0",
"private": true, "private": true,
"name": "grafana", "name": "grafana",
"version": "6.6.0-pre", "version": "6.6.1",
"repository": { "repository": {
"type": "git", "type": "git",
"url": "http://github.com/grafana/grafana.git" "url": "http://github.com/grafana/grafana.git"
@@ -266,6 +266,7 @@
"tether-drop": "https://github.com/torkelo/drop/tarball/master", "tether-drop": "https://github.com/torkelo/drop/tarball/master",
"tinycolor2": "1.4.1", "tinycolor2": "1.4.1",
"tti-polyfill": "0.2.2", "tti-polyfill": "0.2.2",
"url-search-params-polyfill": "7.0.1",
"xss": "1.0.3" "xss": "1.0.3"
}, },
"resolutions": { "resolutions": {

View File

@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/data", "name": "@grafana/data",
"version": "6.6.0-pre", "version": "6.6.1",
"description": "Grafana Data Library", "description": "Grafana Data Library",
"keywords": [ "keywords": [
"typescript" "typescript"

View File

@@ -1,5 +1,6 @@
import { Vector } from '../types/vector'; import { Vector } from '../types/vector';
import { DataFrame } from '../types/dataFrame'; import { DataFrame } from '../types/dataFrame';
import { DisplayProcessor } from '../types';
/** /**
* This abstraction will present the contents of a DataFrame as if * This abstraction will present the contents of a DataFrame as if
@@ -55,6 +56,20 @@ export class DataFrameView<T = any> implements Vector<T> {
return this.data.length; return this.data.length;
} }
getFieldDisplayProcessor(colIndex: number): DisplayProcessor | null {
if (!this.dataFrame || !this.dataFrame.fields) {
return null;
}
const field = this.dataFrame.fields[colIndex];
if (!field || !field.display) {
return null;
}
return field.display;
}
get(idx: number) { get(idx: number) {
this.index = idx; this.index = idx;
return this.obj; return this.obj;

View File

@@ -33,15 +33,16 @@ export type DataSourceOptionsType<DSType extends DataSourceApi<any, any>> = DSTy
export class DataSourcePlugin< export class DataSourcePlugin<
DSType extends DataSourceApi<TQuery, TOptions>, DSType extends DataSourceApi<TQuery, TOptions>,
TQuery extends DataQuery = DataSourceQueryType<DSType>, TQuery extends DataQuery = DataSourceQueryType<DSType>,
TOptions extends DataSourceJsonData = DataSourceOptionsType<DSType> TOptions extends DataSourceJsonData = DataSourceOptionsType<DSType>,
TSecureOptions = {}
> extends GrafanaPlugin<DataSourcePluginMeta<TOptions>> { > extends GrafanaPlugin<DataSourcePluginMeta<TOptions>> {
components: DataSourcePluginComponents<DSType, TQuery, TOptions> = {}; components: DataSourcePluginComponents<DSType, TQuery, TOptions, TSecureOptions> = {};
constructor(public DataSourceClass: DataSourceConstructor<DSType, TQuery, TOptions>) { constructor(public DataSourceClass: DataSourceConstructor<DSType, TQuery, TOptions>) {
super(); super();
} }
setConfigEditor(editor: ComponentType<DataSourcePluginOptionsEditorProps<TOptions>>) { setConfigEditor(editor: ComponentType<DataSourcePluginOptionsEditorProps<TOptions, TSecureOptions>>) {
this.components.ConfigEditor = editor; this.components.ConfigEditor = editor;
return this; return this;
} }
@@ -131,7 +132,8 @@ interface PluginMetaQueryOptions {
export interface DataSourcePluginComponents< export interface DataSourcePluginComponents<
DSType extends DataSourceApi<TQuery, TOptions>, DSType extends DataSourceApi<TQuery, TOptions>,
TQuery extends DataQuery = DataQuery, TQuery extends DataQuery = DataQuery,
TOptions extends DataSourceJsonData = DataSourceJsonData TOptions extends DataSourceJsonData = DataSourceJsonData,
TSecureOptions = {}
> { > {
QueryCtrl?: any; QueryCtrl?: any;
AnnotationsQueryCtrl?: any; AnnotationsQueryCtrl?: any;
@@ -141,7 +143,7 @@ export interface DataSourcePluginComponents<
ExploreMetricsQueryField?: ComponentType<ExploreQueryFieldProps<DSType, TQuery, TOptions>>; ExploreMetricsQueryField?: ComponentType<ExploreQueryFieldProps<DSType, TQuery, TOptions>>;
ExploreLogsQueryField?: ComponentType<ExploreQueryFieldProps<DSType, TQuery, TOptions>>; ExploreLogsQueryField?: ComponentType<ExploreQueryFieldProps<DSType, TQuery, TOptions>>;
ExploreStartPage?: ComponentType<ExploreStartPageProps>; ExploreStartPage?: ComponentType<ExploreStartPageProps>;
ConfigEditor?: ComponentType<DataSourcePluginOptionsEditorProps<TOptions>>; ConfigEditor?: ComponentType<DataSourcePluginOptionsEditorProps<TOptions, TSecureOptions>>;
MetadataInspector?: ComponentType<MetadataInspectorProps<DSType, TQuery, TOptions>>; MetadataInspector?: ComponentType<MetadataInspectorProps<DSType, TQuery, TOptions>>;
} }
@@ -276,7 +278,7 @@ export abstract class DataSourceApi<
*/ */
annotationQuery?(options: AnnotationQueryRequest<TQuery>): Promise<AnnotationEvent[]>; annotationQuery?(options: AnnotationQueryRequest<TQuery>): Promise<AnnotationEvent[]>;
interpolateVariablesInQueries?(queries: TQuery[]): TQuery[]; interpolateVariablesInQueries?(queries: TQuery[], scopedVars: ScopedVars | {}): TQuery[];
} }
export interface MetadataInspectorProps< export interface MetadataInspectorProps<

View File

@@ -56,7 +56,6 @@ export interface LogRowModel {
logLevel: LogLevel; logLevel: LogLevel;
raw: string; raw: string;
searchWords?: string[]; searchWords?: string[];
timestamp: string; // ISO with nanosec precision
timeFromNow: string; timeFromNow: string;
timeEpochMs: number; timeEpochMs: number;
timeLocal: string; timeLocal: string;

View File

@@ -52,6 +52,7 @@ export interface PanelModel<TOptions = any> {
id: number; id: number;
options: TOptions; options: TOptions;
pluginVersion?: string; pluginVersion?: string;
scopedVars?: ScopedVars;
} }
/** /**

View File

@@ -6,6 +6,7 @@ import {
getParser, getParser,
LogsParsers, LogsParsers,
calculateStats, calculateStats,
getLogLevelFromKey,
} from './logs'; } from './logs';
describe('getLoglevel()', () => { describe('getLoglevel()', () => {
@@ -23,6 +24,10 @@ describe('getLoglevel()', () => {
expect(getLogLevel('[Warn]')).toBe('warning'); expect(getLogLevel('[Warn]')).toBe('warning');
}); });
it('returns correct log level when level is capitalized', () => {
expect(getLogLevel('WARN')).toBe(LogLevel.warn);
});
it('returns log level on line contains a log level', () => { it('returns log level on line contains a log level', () => {
expect(getLogLevel('warn: it is looking bad')).toBe(LogLevel.warn); expect(getLogLevel('warn: it is looking bad')).toBe(LogLevel.warn);
expect(getLogLevel('2007-12-12 12:12:12 [WARN]: it is looking bad')).toBe(LogLevel.warn); expect(getLogLevel('2007-12-12 12:12:12 [WARN]: it is looking bad')).toBe(LogLevel.warn);
@@ -33,6 +38,15 @@ describe('getLoglevel()', () => {
}); });
}); });
describe('getLogLevelFromKey()', () => {
it('returns correct log level', () => {
expect(getLogLevelFromKey('info')).toBe(LogLevel.info);
});
it('returns correct log level when level is capitalized', () => {
expect(getLogLevelFromKey('INFO')).toBe(LogLevel.info);
});
});
describe('calculateLogsLabelStats()', () => { describe('calculateLogsLabelStats()', () => {
test('should return no stats for empty rows', () => { test('should return no stats for empty rows', () => {
expect(calculateLogsLabelStats([], '')).toEqual([]); expect(calculateLogsLabelStats([], '')).toEqual([]);

View File

@@ -33,7 +33,7 @@ export function getLogLevel(line: string): LogLevel {
} }
export function getLogLevelFromKey(key: string): LogLevel { export function getLogLevelFromKey(key: string): LogLevel {
const level = (LogLevel as any)[key]; const level = (LogLevel as any)[key.toLowerCase()];
if (level) { if (level) {
return level; return level;
} }

View File

@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/e2e", "name": "@grafana/e2e",
"version": "6.4.0-pre", "version": "6.6.1",
"description": "Grafana End to End Test Library", "description": "Grafana End to End Test Library",
"keywords": [ "keywords": [
"grafana", "grafana",

View File

@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/runtime", "name": "@grafana/runtime",
"version": "6.6.0-pre", "version": "6.6.1",
"description": "Grafana Runtime Library", "description": "Grafana Runtime Library",
"keywords": [ "keywords": [
"grafana", "grafana",
@@ -21,8 +21,8 @@
"build": "grafana-toolkit package:build --scope=runtime" "build": "grafana-toolkit package:build --scope=runtime"
}, },
"dependencies": { "dependencies": {
"@grafana/data": "^6.6.0-pre", "@grafana/data": "6.6.1",
"@grafana/ui": "^6.6.0-pre", "@grafana/ui": "6.6.1",
"systemjs": "0.20.19", "systemjs": "0.20.19",
"systemjs-plugin-css": "0.1.37" "systemjs-plugin-css": "0.1.37"
}, },

View File

@@ -7,6 +7,7 @@ export interface BuildInfo {
commit: string; commit: string;
isEnterprise: boolean; // deprecated: use licenseInfo.hasLicense instead isEnterprise: boolean; // deprecated: use licenseInfo.hasLicense instead
env: string; env: string;
edition: string;
latestVersion: string; latestVersion: string;
hasUpdate: boolean; hasUpdate: boolean;
} }
@@ -16,11 +17,14 @@ interface FeatureToggles {
inspect: boolean; inspect: boolean;
expressions: boolean; expressions: boolean;
newEdit: boolean; newEdit: boolean;
meta: boolean;
} }
interface LicenseInfo { interface LicenseInfo {
hasLicense: boolean; hasLicense: boolean;
expiry: number; expiry: number;
licenseUrl: string;
stateInfo: string;
} }
export class GrafanaBootConfig { export class GrafanaBootConfig {
@@ -60,8 +64,10 @@ export class GrafanaBootConfig {
inspect: false, inspect: false,
expressions: false, expressions: false,
newEdit: false, newEdit: false,
meta: false,
}; };
licenseInfo: LicenseInfo = {} as LicenseInfo; licenseInfo: LicenseInfo = {} as LicenseInfo;
phantomJSRenderer = false;
constructor(options: GrafanaBootConfig) { constructor(options: GrafanaBootConfig) {
this.theme = options.bootData.user.lightTheme ? getTheme(GrafanaThemeType.Light) : getTheme(GrafanaThemeType.Dark); this.theme = options.bootData.user.lightTheme ? getTheme(GrafanaThemeType.Light) : getTheme(GrafanaThemeType.Dark);

View File

@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/toolkit", "name": "@grafana/toolkit",
"version": "6.6.0-pre", "version": "6.6.1",
"description": "Grafana Toolkit", "description": "Grafana Toolkit",
"keywords": [ "keywords": [
"grafana", "grafana",
@@ -28,8 +28,8 @@
"dependencies": { "dependencies": {
"@babel/core": "7.6.4", "@babel/core": "7.6.4",
"@babel/preset-env": "7.6.3", "@babel/preset-env": "7.6.3",
"@grafana/data": "^6.6.0-pre", "@grafana/data": "6.6.1",
"@grafana/ui": "^6.6.0-pre", "@grafana/ui": "6.6.1",
"@types/command-exists": "^1.2.0", "@types/command-exists": "^1.2.0",
"@types/execa": "^0.9.0", "@types/execa": "^0.9.0",
"@types/expect-puppeteer": "3.3.1", "@types/expect-puppeteer": "3.3.1",
@@ -62,9 +62,12 @@
"html-webpack-plugin": "^3.2.0", "html-webpack-plugin": "^3.2.0",
"inquirer": "^6.3.1", "inquirer": "^6.3.1",
"jest": "24.8.0", "jest": "24.8.0",
"jest-canvas-mock": "2.1.2",
"jest-cli": "^24.8.0", "jest-cli": "^24.8.0",
"jest-coverage-badges": "^1.1.2", "jest-coverage-badges": "^1.1.2",
"jest-junit": "^6.4.0", "jest-junit": "^6.4.0",
"less": "^3.10.3",
"less-loader": "^5.0.0",
"lodash": "4.17.15", "lodash": "4.17.15",
"md5-file": "^4.0.0", "md5-file": "^4.0.0",
"mini-css-extract-plugin": "^0.7.0", "mini-css-extract-plugin": "^0.7.0",
@@ -95,9 +98,7 @@
"tslint-config-prettier": "^1.18.0", "tslint-config-prettier": "^1.18.0",
"typescript": "3.7.2", "typescript": "3.7.2",
"url-loader": "^2.0.1", "url-loader": "^2.0.1",
"webpack": "4.35.0", "webpack": "4.35.0"
"less": "^3.10.3",
"less-loader": "^5.0.0"
}, },
"_moduleAliases": { "_moduleAliases": {
"puppeteer": "node_modules/puppeteer-core" "puppeteer": "node_modules/puppeteer-core"

View File

@@ -40,13 +40,13 @@ export const prepare = useSpinner<void>('Preparing', async () => {
await Promise.all([ await Promise.all([
// Copy only if local tsconfig does not exist. Otherwise this will work, but have odd behavior // Copy only if local tsconfig does not exist. Otherwise this will work, but have odd behavior
copyIfNonExistent( copyIfNonExistent(
resolvePath(process.cwd(), 'tsconfig.json'), resolvePath(__dirname, '../../config/tsconfig.plugin.local.json'),
resolvePath(__dirname, '../../config/tsconfig.plugin.local.json') resolvePath(process.cwd(), 'tsconfig.json')
), ),
// Copy only if local prettierrc does not exist. Otherwise this will work, but have odd behavior // Copy only if local prettierrc does not exist. Otherwise this will work, but have odd behavior
copyIfNonExistent( copyIfNonExistent(
resolvePath(process.cwd(), '.prettierrc.js'), resolvePath(__dirname, '../../config/prettier.plugin.rc.js'),
resolvePath(__dirname, '../../config/prettier.plugin.rc.js') resolvePath(process.cwd(), '.prettierrc.js')
), ),
]); ]);
@@ -149,7 +149,9 @@ export const lintPlugin = useSpinner<Fixable>('Linting', async ({ fix }) => {
if (lintResults.length > 0) { if (lintResults.length > 0) {
console.log('\n'); console.log('\n');
const failures: RuleFailure[] = lintResults.flat(); const failures = lintResults.reduce<RuleFailure[]>((failures, result) => {
return [...failures, ...result.failures];
}, []);
failures.forEach(f => { failures.forEach(f => {
// tslint:disable-next-line // tslint:disable-next-line
console.log( console.log(

View File

@@ -50,7 +50,7 @@ export const jestConfig = (baseDir: string = process.cwd()) => {
const setupFile = getSetupFile(setupFilePath); const setupFile = getSetupFile(setupFilePath);
const shimsFile = getSetupFile(shimsFilePath); const shimsFile = getSetupFile(shimsFilePath);
const setupFiles = [setupFile, shimsFile].filter(f => f); const setupFiles = [setupFile, shimsFile, 'jest-canvas-mock'].filter(f => f);
const defaultJestConfig = { const defaultJestConfig = {
preset: 'ts-jest', preset: 'ts-jest',
verbose: false, verbose: false,

View File

@@ -2,7 +2,7 @@
"author": "Grafana Labs", "author": "Grafana Labs",
"license": "Apache-2.0", "license": "Apache-2.0",
"name": "@grafana/ui", "name": "@grafana/ui",
"version": "6.6.0-pre", "version": "6.6.1",
"description": "Grafana Components Library", "description": "Grafana Components Library",
"keywords": [ "keywords": [
"grafana", "grafana",
@@ -25,7 +25,7 @@
"build": "grafana-toolkit package:build --scope=ui" "build": "grafana-toolkit package:build --scope=ui"
}, },
"dependencies": { "dependencies": {
"@grafana/data": "^6.6.0-pre", "@grafana/data": "6.6.1",
"@grafana/slate-react": "0.22.9-grafana", "@grafana/slate-react": "0.22.9-grafana",
"@torkelo/react-select": "2.1.1", "@torkelo/react-select": "2.1.1",
"@types/react-color": "2.17.0", "@types/react-color": "2.17.0",

View File

@@ -54,6 +54,7 @@
background: none; background: none;
display: flex; display: flex;
align-items: center; align-items: center;
.fa { .fa {
align-self: flex-end; align-self: flex-end;
font-size: 21px; font-size: 21px;
@@ -78,6 +79,11 @@
.alert-body { .alert-body {
flex-grow: 1; flex-grow: 1;
a {
color: $white;
text-decoration: underline;
}
} }
.alert-icon-on-top { .alert-icon-on-top {

View File

@@ -131,7 +131,8 @@ export class BarGauge extends PureComponent<Props> {
}; };
} }
const color = display(positionValue).color; const color = display ? display(positionValue).color : null;
if (color) { if (color) {
// if we are past real value the cell is not "on" // if we are past real value the cell is not "on"
if (value === null || (positionValue !== null && positionValue > value.numeric)) { if (value === null || (positionValue !== null && positionValue > value.numeric)) {

View File

@@ -363,8 +363,9 @@ export class StackedWithChartLayout extends BigValueLayout {
// make title fontsize it's a bit smaller than valueFontSize // make title fontsize it's a bit smaller than valueFontSize
this.titleFontSize = Math.min(this.valueFontSize * 0.7, this.titleFontSize); this.titleFontSize = Math.min(this.valueFontSize * 0.7, this.titleFontSize);
// make chart take up onused space // make chart take up onused space
this.chartHeight = height - this.titleFontSize * LINE_HEIGHT - this.valueFontSize * LINE_HEIGHT + height * 0.05; this.chartHeight = height - this.titleFontSize * LINE_HEIGHT - this.valueFontSize * LINE_HEIGHT;
} }
getValueAndTitleContainerStyles() { getValueAndTitleContainerStyles() {

View File

@@ -29,9 +29,11 @@ interface CascaderState {
export interface CascaderOption { export interface CascaderOption {
value: any; value: any;
label: string; label: string;
// Items will be just flattened into the main list of items recursively.
items?: CascaderOption[]; items?: CascaderOption[];
disabled?: boolean; disabled?: boolean;
title?: string; title?: string;
// Children will be shown in a submenu.
children?: CascaderOption[]; children?: CascaderOption[];
} }

View File

@@ -319,7 +319,7 @@ export class Graph extends PureComponent<GraphProps, GraphState> {
// Dividig the width by 1.5 to make the bars not touch each other // Dividig the width by 1.5 to make the bars not touch each other
barWidth: showBars ? this.getBarWidth() / 1.5 : 1, barWidth: showBars ? this.getBarWidth() / 1.5 : 1,
zero: false, zero: false,
lineWidth: lineWidth, lineWidth: 0,
}, },
shadowSize: 0, shadowSize: 0,
}, },

View File

@@ -20,7 +20,6 @@ const setup = (propOverrides?: Partial<Props>, rowOverrides?: Partial<LogRowMode
hasAnsi: false, hasAnsi: false,
entry: '', entry: '',
raw: '', raw: '',
timestamp: '',
uid: '0', uid: '0',
labels: {}, labels: {},
...(rowOverrides || {}), ...(rowOverrides || {}),

View File

@@ -92,17 +92,28 @@ class UnThemedLogDetailsRow extends PureComponent<Props, State> {
return ( return (
<tr className={cx(style.logDetailsValue, { [styles.noHoverBackground]: showFieldsStats })}> <tr className={cx(style.logDetailsValue, { [styles.noHoverBackground]: showFieldsStats })}>
{/* Action buttons - show stats/filter results */} {/* Action buttons - show stats/filter results */}
<td title="Ad-hoc statistics" onClick={this.showStats} className={style.logsDetailsIcon}> <td className={style.logsDetailsIcon} colSpan={isLabel ? undefined : 3}>
<i className={`fa fa-signal ${styles.hoverCursor}`} /> <i title="Ad-hoc statistics" className={`fa fa-signal ${styles.hoverCursor}`} onClick={this.showStats} />
</td> </td>
<td title="Filter for value" onClick={() => isLabel && this.filterLabel()} className={style.logsDetailsIcon}> {isLabel && (
{isLabel && <i className={`fa fa-search-plus ${styles.hoverCursor}`} />} <>
<td className={style.logsDetailsIcon}>
<i
title="Filter for value"
className={`fa fa-search-plus ${styles.hoverCursor}`}
onClick={this.filterLabel}
/>
</td> </td>
<td className={style.logsDetailsIcon}>
<td title="Filter out value" onClick={() => isLabel && this.filterOutLabel()} className={style.logsDetailsIcon}> <i
{isLabel && <i className={`fa fa-search-minus ${styles.hoverCursor}`} />} title="Filter out value"
className={`fa fa-search-minus ${styles.hoverCursor}`}
onClick={this.filterOutLabel}
/>
</td> </td>
</>
)}
{/* Key - value columns */} {/* Key - value columns */}
<td className={style.logDetailsLabel}>{parsedKey}</td> <td className={style.logDetailsLabel}>{parsedKey}</td>

View File

@@ -3,7 +3,38 @@ import { getRowContexts } from './LogRowContextProvider';
describe('getRowContexts', () => { describe('getRowContexts', () => {
describe('when called with a DataFrame and results are returned', () => { describe('when called with a DataFrame and results are returned', () => {
it('then the result should be in correct format', async () => { it('then the result should be in correct format and filtered', async () => {
const firstResult = new MutableDataFrame({
refId: 'B',
fields: [
{ name: 'ts', type: FieldType.time, values: [3, 2, 1] },
{ name: 'line', type: FieldType.string, values: ['3', '2', '1'], labels: {} },
{ name: 'id', type: FieldType.string, values: ['3', '2', '1'], labels: {} },
],
});
const secondResult = new MutableDataFrame({
refId: 'B',
fields: [
{ name: 'ts', type: FieldType.time, values: [6, 5, 4] },
{ name: 'line', type: FieldType.string, values: ['6', '5', '4'], labels: {} },
{ name: 'id', type: FieldType.string, values: ['6', '5', '4'], labels: {} },
],
});
let called = false;
const getRowContextMock = (row: LogRowModel, options?: any): Promise<DataQueryResponse> => {
if (!called) {
called = true;
return Promise.resolve({ data: [firstResult] });
}
return Promise.resolve({ data: [secondResult] });
};
const result = await getRowContexts(getRowContextMock, row, 10);
expect(result).toEqual({ data: [[['3', '2']], [['6', '5', '4']]], errors: ['', ''] });
});
it('then the result should be in correct format and filtered without uid', async () => {
const firstResult = new MutableDataFrame({ const firstResult = new MutableDataFrame({
refId: 'B', refId: 'B',
fields: [ fields: [
@@ -18,23 +49,6 @@ describe('getRowContexts', () => {
{ name: 'line', type: FieldType.string, values: ['6', '5', '4'], labels: {} }, { name: 'line', type: FieldType.string, values: ['6', '5', '4'], labels: {} },
], ],
}); });
const row: LogRowModel = {
entryFieldIndex: 0,
rowIndex: 0,
dataFrame: new MutableDataFrame(),
entry: '4',
labels: (null as any) as Labels,
hasAnsi: false,
raw: '4',
logLevel: LogLevel.info,
timeEpochMs: 4,
timeFromNow: '',
timeLocal: '',
timeUtc: '',
timestamp: '4',
uid: '1',
};
let called = false; let called = false;
const getRowContextMock = (row: LogRowModel, options?: any): Promise<DataQueryResponse> => { const getRowContextMock = (row: LogRowModel, options?: any): Promise<DataQueryResponse> => {
if (!called) { if (!called) {
@@ -46,7 +60,7 @@ describe('getRowContexts', () => {
const result = await getRowContexts(getRowContextMock, row, 10); const result = await getRowContexts(getRowContextMock, row, 10);
expect(result).toEqual({ data: [[['3', '2', '1']], [['6', '5', '4']]], errors: ['', ''] }); expect(result).toEqual({ data: [[['3', '2', '1']], [['6', '5']]], errors: ['', ''] });
}); });
}); });
@@ -54,23 +68,6 @@ describe('getRowContexts', () => {
it('then the result should be in correct format', async () => { it('then the result should be in correct format', async () => {
const firstError = new Error('Error 1'); const firstError = new Error('Error 1');
const secondError = new Error('Error 2'); const secondError = new Error('Error 2');
const row: LogRowModel = {
entryFieldIndex: 0,
rowIndex: 0,
dataFrame: new MutableDataFrame(),
entry: '4',
labels: (null as any) as Labels,
hasAnsi: false,
raw: '4',
logLevel: LogLevel.info,
timeEpochMs: 4,
timeFromNow: '',
timeLocal: '',
timeUtc: '',
timestamp: '4',
uid: '1',
};
let called = false; let called = false;
const getRowContextMock = (row: LogRowModel, options?: any): Promise<DataQueryResponse> => { const getRowContextMock = (row: LogRowModel, options?: any): Promise<DataQueryResponse> => {
if (!called) { if (!called) {
@@ -86,3 +83,19 @@ describe('getRowContexts', () => {
}); });
}); });
}); });
const row: LogRowModel = {
entryFieldIndex: 0,
rowIndex: 0,
dataFrame: new MutableDataFrame(),
entry: '4',
labels: (null as any) as Labels,
hasAnsi: false,
raw: '4',
logLevel: LogLevel.info,
timeEpochMs: 4,
timeFromNow: '',
timeLocal: '',
timeUtc: '',
uid: '1',
};

View File

@@ -1,5 +1,5 @@
import { LogRowModel, toDataFrame, Field } from '@grafana/data'; import { LogRowModel, toDataFrame, Field, FieldCache } from '@grafana/data';
import { useState, useEffect } from 'react'; import React, { useState, useEffect } from 'react';
import flatten from 'lodash/flatten'; import flatten from 'lodash/flatten';
import useAsync from 'react-use/lib/useAsync'; import useAsync from 'react-use/lib/useAsync';
@@ -45,7 +45,8 @@ export const getRowContexts = async (
limit, limit,
}), }),
getRowContext(row, { getRowContext(row, {
limit: limit + 1, // Lets add one more to the limit as we're filtering out one row see comment below // The start time is inclusive so we will get the one row we are using as context entry
limit: limit + 1,
direction: 'FORWARD', direction: 'FORWARD',
}), }),
]; ];
@@ -62,17 +63,34 @@ export const getRowContexts = async (
const data: any[] = []; const data: any[] = [];
for (let index = 0; index < dataResult.data.length; index++) { for (let index = 0; index < dataResult.data.length; index++) {
const dataFrame = toDataFrame(dataResult.data[index]); const dataFrame = toDataFrame(dataResult.data[index]);
const timestampField: Field<string> = dataFrame.fields.filter(field => field.name === 'ts')[0]; const fieldCache = new FieldCache(dataFrame);
const timestampField: Field<string> = fieldCache.getFieldByName('ts')!;
const idField: Field<string> | undefined = fieldCache.getFieldByName('id');
for (let fieldIndex = 0; fieldIndex < timestampField.values.length; fieldIndex++) { for (let fieldIndex = 0; fieldIndex < timestampField.values.length; fieldIndex++) {
const timestamp = timestampField.values.get(fieldIndex); // TODO: this filtering is datasource dependant so it will make sense to move it there so the API is
// to return correct list of lines handling inclusive ranges or how to filter the correct line on the
// datasource.
// We need to filter out the row we're basing our search from because of how start/end params work in Loki API // Filter out the row that is the one used as a focal point for the context as we will get it in one of the
// see https://github.com/grafana/loki/issues/597#issuecomment-506408980 // requests.
// the alternative to create our own add 1 nanosecond method to the a timestamp string would be quite complex if (idField) {
if (timestamp === row.timestamp) { // For Loki this means we filter only the one row. Issue is we could have other rows logged at the same
// ns which came before but they come in the response that search for logs after. This means right now
// we will show those as if they came after. This is not strictly correct but seems better than loosing them
// and making this correct would mean quite a bit of complexity to shuffle things around and messing up
//counts.
if (idField.values.get(fieldIndex) === row.uid) {
continue; continue;
} }
} else {
// Fallback to timestamp. This should not happen right now as this feature is implemented only for loki
// and that has ID. Later this branch could be used in other DS but mind that this could also filter out
// logs which were logged in the same timestamp and that can be a problem depending on the precision.
if (parseInt(timestampField.values.get(fieldIndex), 10) === row.timeEpochMs) {
continue;
}
}
const lineField: Field<string> = dataFrame.fields.filter(field => field.name === 'line')[0]; const lineField: Field<string> = dataFrame.fields.filter(field => field.name === 'line')[0];
const line = lineField.values.get(fieldIndex); // assuming that both fields have same length const line = lineField.values.get(fieldIndex); // assuming that both fields have same length

View File

@@ -109,7 +109,6 @@ const makeLog = (overrides: Partial<LogRowModel>): LogRowModel => {
hasAnsi: false, hasAnsi: false,
labels: {}, labels: {},
raw: entry, raw: entry,
timestamp: '',
timeFromNow: '', timeFromNow: '',
timeEpochMs: 1, timeEpochMs: 1,
timeLocal: '', timeLocal: '',

View File

@@ -86,7 +86,7 @@ export function sharedSingleStatPanelChangedHandler(
defaults.mappings = mappings; defaults.mappings = mappings;
} }
if (panel.gauge) { if (panel.gauge && panel.gauge.show) {
defaults.min = panel.gauge.minValue; defaults.min = panel.gauge.minValue;
defaults.max = panel.gauge.maxValue; defaults.max = panel.gauge.maxValue;
} }
@@ -151,11 +151,10 @@ export function sharedSingleStatMigrationHandler(panel: PanelModel<SingleStatBas
// Migrate color from simple string to a mode // Migrate color from simple string to a mode
const { defaults } = fieldOptions; const { defaults } = fieldOptions;
if (defaults.color) { if (defaults.color && typeof defaults.color === 'string') {
const old = defaults.color;
defaults.color = { defaults.color = {
mode: FieldColorMode.Fixed, mode: FieldColorMode.Fixed,
fixedColor: old, fixedColor: defaults.color,
}; };
} }

View File

@@ -158,7 +158,7 @@ export class UnthemedTimePicker extends PureComponent<Props, State> {
const hasAbsolute = isDateTime(value.raw.from) || isDateTime(value.raw.to); const hasAbsolute = isDateTime(value.raw.from) || isDateTime(value.raw.to);
const syncedTimePicker = timeSyncButton && isSynced; const syncedTimePicker = timeSyncButton && isSynced;
const timePickerIconClass = cx('fa fa-clock-o fa-fw', { ['icon-brand-gradient']: syncedTimePicker }); const timePickerIconClass = cx('fa fa-clock-o fa-fw', { ['icon-brand-gradient']: syncedTimePicker });
const timePickerButtonClass = cx('btn navbar-button navbar-button--zoom', { const timePickerButtonClass = cx('btn navbar-button navbar-button--tight', {
[`btn--radius-right-0 ${styles.noRightBorderStyle}`]: !!timeSyncButton, [`btn--radius-right-0 ${styles.noRightBorderStyle}`]: !!timeSyncButton,
[`explore-active-button-glow ${styles.syncedTimePicker}`]: syncedTimePicker, [`explore-active-button-glow ${styles.syncedTimePicker}`]: syncedTimePicker,
}); });

View File

@@ -10,7 +10,6 @@ import {
TIME_FORMAT, TIME_FORMAT,
} from '@grafana/data'; } from '@grafana/data';
import { stringToDateTimeType } from '../time'; import { stringToDateTimeType } from '../time';
import { isMathString } from '@grafana/data/src/datetime/datemath';
export const mapOptionToTimeRange = (option: TimeOption, timeZone?: TimeZone): TimeRange => { export const mapOptionToTimeRange = (option: TimeOption, timeZone?: TimeZone): TimeRange => {
return { return {
@@ -41,7 +40,7 @@ export const mapStringsToTimeRange = (from: string, to: string, roundup?: boolea
const fromDate = stringToDateTimeType(from, roundup, timeZone); const fromDate = stringToDateTimeType(from, roundup, timeZone);
const toDate = stringToDateTimeType(to, roundup, timeZone); const toDate = stringToDateTimeType(to, roundup, timeZone);
if (isMathString(from) || isMathString(to)) { if (dateMath.isMathString(from) || dateMath.isMathString(to)) {
return { return {
from: fromDate, from: fromDate,
to: toDate, to: toDate,

View File

@@ -35,7 +35,7 @@ exports[`TimePicker renders buttons correctly 1`] = `
> >
<button <button
aria-label="TimePicker Open Button" aria-label="TimePicker Open Button"
className="btn navbar-button navbar-button--zoom" className="btn navbar-button navbar-button--tight"
onClick={[Function]} onClick={[Function]}
> >
<i <i
@@ -342,7 +342,7 @@ exports[`TimePicker renders content correctly after beeing open 1`] = `
> >
<button <button
aria-label="TimePicker Open Button" aria-label="TimePicker Open Button"
className="btn navbar-button navbar-button--zoom" className="btn navbar-button navbar-button--tight"
onClick={[Function]} onClick={[Function]}
> >
<i <i

View File

@@ -14,3 +14,4 @@
@import 'TimePicker/TimeOfDayPicker'; @import 'TimePicker/TimeOfDayPicker';
@import 'Tooltip/Tooltip'; @import 'Tooltip/Tooltip';
@import 'ValueMappingsEditor/ValueMappingsEditor'; @import 'ValueMappingsEditor/ValueMappingsEditor';
@import 'Alert/Alert';

View File

@@ -3,6 +3,8 @@ package api
import ( import (
"strconv" "strconv"
"github.com/grafana/grafana/pkg/services/rendering"
"github.com/grafana/grafana/pkg/components/simplejson" "github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/util" "github.com/grafana/grafana/pkg/util"
@@ -194,6 +196,7 @@ func (hs *HTTPServer) getFrontendSettingsMap(c *m.ReqContext) (map[string]interf
"version": setting.BuildVersion, "version": setting.BuildVersion,
"commit": setting.BuildCommit, "commit": setting.BuildCommit,
"buildstamp": setting.BuildStamp, "buildstamp": setting.BuildStamp,
"edition": hs.License.Edition(),
"latestVersion": plugins.GrafanaLatestVersion, "latestVersion": plugins.GrafanaLatestVersion,
"hasUpdate": plugins.GrafanaHasUpdate, "hasUpdate": plugins.GrafanaHasUpdate,
"env": setting.Env, "env": setting.Env,
@@ -202,8 +205,11 @@ func (hs *HTTPServer) getFrontendSettingsMap(c *m.ReqContext) (map[string]interf
"licenseInfo": map[string]interface{}{ "licenseInfo": map[string]interface{}{
"hasLicense": hs.License.HasLicense(), "hasLicense": hs.License.HasLicense(),
"expiry": hs.License.Expiry(), "expiry": hs.License.Expiry(),
"stateInfo": hs.License.StateInfo(),
"licenseUrl": hs.License.LicenseURL(c.SignedInUser),
}, },
"featureToggles": hs.Cfg.FeatureToggles, "featureToggles": hs.Cfg.FeatureToggles,
"phantomJSRenderer": rendering.IsPhantomJSEnabled,
} }
return jsonObj, nil return jsonObj, nil

View File

@@ -275,7 +275,7 @@ func (hs *HTTPServer) setIndexViewData(c *m.ReqContext) (*dtos.IndexViewData, er
}) })
} }
if c.OrgRole == m.ROLE_ADMIN || hs.Cfg.EditorsCanAdmin { if c.OrgRole == m.ROLE_ADMIN || (hs.Cfg.EditorsCanAdmin && c.OrgRole == m.ROLE_EDITOR) {
configNodes = append(configNodes, &dtos.NavLink{ configNodes = append(configNodes, &dtos.NavLink{
Text: "Teams", Text: "Teams",
Id: "teams", Id: "teams",
@@ -357,7 +357,7 @@ func (hs *HTTPServer) setIndexViewData(c *m.ReqContext) (*dtos.IndexViewData, er
Children: []*dtos.NavLink{}, Children: []*dtos.NavLink{},
}) })
hs.HooksService.RunIndexDataHooks(&data) hs.HooksService.RunIndexDataHooks(&data, c)
sort.SliceStable(data.NavTree, func(i, j int) bool { sort.SliceStable(data.NavTree, func(i, j int) bool {
return data.NavTree[i].SortWeight < data.NavTree[j].SortWeight return data.NavTree[i].SortWeight < data.NavTree[j].SortWeight

View File

@@ -216,6 +216,7 @@ func (server *HTTPServer) PostSyncUserWithLDAP(c *models.ReqContext) Response {
} }
upsertCmd := &models.UpsertUserCommand{ upsertCmd := &models.UpsertUserCommand{
ReqContext: c,
ExternalUser: user, ExternalUser: user,
SignupAllowed: setting.LDAPAllowSignup, SignupAllowed: setting.LDAPAllowSignup,
} }

View File

@@ -154,6 +154,8 @@ var (
func init() { func init() {
httpStatusCodes := []string{"200", "404", "500", "unknown"} httpStatusCodes := []string{"200", "404", "500", "unknown"}
objectiveMap := map[float64]float64{0.5: 0.05, 0.9: 0.01, 0.99: 0.001}
MInstanceStart = prometheus.NewCounter(prometheus.CounterOpts{ MInstanceStart = prometheus.NewCounter(prometheus.CounterOpts{
Name: "instance_start_total", Name: "instance_start_total",
Help: "counter for started instances", Help: "counter for started instances",
@@ -193,6 +195,7 @@ func init() {
prometheus.SummaryOpts{ prometheus.SummaryOpts{
Name: "http_request_duration_milliseconds", Name: "http_request_duration_milliseconds",
Help: "http request summary", Help: "http request summary",
Objectives: objectiveMap,
}, },
[]string{"handler", "statuscode", "method"}, []string{"handler", "statuscode", "method"},
) )
@@ -218,18 +221,21 @@ func init() {
MApiDashboardSave = prometheus.NewSummary(prometheus.SummaryOpts{ MApiDashboardSave = prometheus.NewSummary(prometheus.SummaryOpts{
Name: "api_dashboard_save_milliseconds", Name: "api_dashboard_save_milliseconds",
Help: "summary for dashboard save duration", Help: "summary for dashboard save duration",
Objectives: objectiveMap,
Namespace: exporterName, Namespace: exporterName,
}) })
MApiDashboardGet = prometheus.NewSummary(prometheus.SummaryOpts{ MApiDashboardGet = prometheus.NewSummary(prometheus.SummaryOpts{
Name: "api_dashboard_get_milliseconds", Name: "api_dashboard_get_milliseconds",
Help: "summary for dashboard get duration", Help: "summary for dashboard get duration",
Objectives: objectiveMap,
Namespace: exporterName, Namespace: exporterName,
}) })
MApiDashboardSearch = prometheus.NewSummary(prometheus.SummaryOpts{ MApiDashboardSearch = prometheus.NewSummary(prometheus.SummaryOpts{
Name: "api_dashboard_search_milliseconds", Name: "api_dashboard_search_milliseconds",
Help: "summary for dashboard search duration", Help: "summary for dashboard search duration",
Objectives: objectiveMap,
Namespace: exporterName, Namespace: exporterName,
}) })
@@ -332,18 +338,21 @@ func init() {
LDAPUsersSyncExecutionTime = prometheus.NewSummary(prometheus.SummaryOpts{ LDAPUsersSyncExecutionTime = prometheus.NewSummary(prometheus.SummaryOpts{
Name: "ldap_users_sync_execution_time", Name: "ldap_users_sync_execution_time",
Help: "summary for LDAP users sync execution duration", Help: "summary for LDAP users sync execution duration",
Objectives: objectiveMap,
Namespace: exporterName, Namespace: exporterName,
}) })
MDataSourceProxyReqTimer = prometheus.NewSummary(prometheus.SummaryOpts{ MDataSourceProxyReqTimer = prometheus.NewSummary(prometheus.SummaryOpts{
Name: "api_dataproxy_request_all_milliseconds", Name: "api_dataproxy_request_all_milliseconds",
Help: "summary for dataproxy request duration", Help: "summary for dataproxy request duration",
Objectives: objectiveMap,
Namespace: exporterName, Namespace: exporterName,
}) })
MAlertingExecutionTime = prometheus.NewSummary(prometheus.SummaryOpts{ MAlertingExecutionTime = prometheus.NewSummary(prometheus.SummaryOpts{
Name: "alerting_execution_time_milliseconds", Name: "alerting_execution_time_milliseconds",
Help: "summary of alert exeuction duration", Help: "summary of alert exeuction duration",
Objectives: objectiveMap,
Namespace: exporterName, Namespace: exporterName,
}) })

View File

@@ -51,6 +51,7 @@ var loginUsingLDAP = func(query *models.LoginUserQuery) (bool, error) {
} }
upsert := &models.UpsertUserCommand{ upsert := &models.UpsertUserCommand{
ReqContext: query.ReqContext,
ExternalUser: externalUser, ExternalUser: externalUser,
SignupAllowed: setting.LDAPAllowSignup, SignupAllowed: setting.LDAPAllowSignup,
} }

View File

@@ -9,4 +9,11 @@ type Licensing interface {
// Expiry returns the unix epoch timestamp when the license expires, or 0 if no valid license is provided // Expiry returns the unix epoch timestamp when the license expires, or 0 if no valid license is provided
Expiry() int64 Expiry() int64
// Return edition
Edition() string
LicenseURL(user *SignedInUser) string
StateInfo() string
} }

View File

@@ -2,10 +2,11 @@ package hooks
import ( import (
"github.com/grafana/grafana/pkg/api/dtos" "github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/registry" "github.com/grafana/grafana/pkg/registry"
) )
type IndexDataHook func(indexData *dtos.IndexViewData) type IndexDataHook func(indexData *dtos.IndexViewData, req *models.ReqContext)
type HooksService struct { type HooksService struct {
indexDataHooks []IndexDataHook indexDataHooks []IndexDataHook
@@ -23,8 +24,8 @@ func (srv *HooksService) AddIndexDataHook(hook IndexDataHook) {
srv.indexDataHooks = append(srv.indexDataHooks, hook) srv.indexDataHooks = append(srv.indexDataHooks, hook)
} }
func (srv *HooksService) RunIndexDataHooks(indexData *dtos.IndexViewData) { func (srv *HooksService) RunIndexDataHooks(indexData *dtos.IndexViewData, req *models.ReqContext) {
for _, hook := range srv.indexDataHooks { for _, hook := range srv.indexDataHooks {
hook(indexData) hook(indexData, req)
} }
} }

View File

@@ -2,6 +2,7 @@ package licensing
import ( import (
"github.com/grafana/grafana/pkg/api/dtos" "github.com/grafana/grafana/pkg/api/dtos"
"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/services/hooks" "github.com/grafana/grafana/pkg/services/hooks"
"github.com/grafana/grafana/pkg/setting" "github.com/grafana/grafana/pkg/setting"
) )
@@ -19,14 +20,30 @@ func (*OSSLicensingService) Expiry() int64 {
return 0 return 0
} }
func (*OSSLicensingService) Edition() string {
return "Open Source"
}
func (*OSSLicensingService) StateInfo() string {
return ""
}
func (l *OSSLicensingService) LicenseURL(user *models.SignedInUser) string {
if user.IsGrafanaAdmin {
return l.Cfg.AppSubUrl + "/admin/upgrading"
}
return "https://grafana.com/products/enterprise/?utm_source=grafana_footer"
}
func (l *OSSLicensingService) Init() error { func (l *OSSLicensingService) Init() error {
l.HooksService.AddIndexDataHook(func(indexData *dtos.IndexViewData) { l.HooksService.AddIndexDataHook(func(indexData *dtos.IndexViewData, req *models.ReqContext) {
for _, node := range indexData.NavTree { for _, node := range indexData.NavTree {
if node.Id == "admin" { if node.Id == "admin" {
node.Children = append(node.Children, &dtos.NavLink{ node.Children = append(node.Children, &dtos.NavLink{
Text: "Upgrade", Text: "Upgrade",
Id: "upgrading", Id: "upgrading",
Url: l.Cfg.AppSubUrl + "/admin/upgrading", Url: l.LicenseURL(req.SignedInUser),
Icon: "fa fa-fw fa-unlock-alt", Icon: "fa fa-fw fa-unlock-alt",
}) })
} }

View File

@@ -23,7 +23,12 @@ func (qs *QuotaService) QuotaReached(c *m.ReqContext, target string) (bool, erro
if !setting.Quota.Enabled { if !setting.Quota.Enabled {
return false, nil return false, nil
} }
// No request context means this is a background service, like LDAP Background Sync.
// TODO: we should replace the req context with a more limited interface or struct,
// something that we could easily provide from background jobs.
if c == nil {
return false, nil
}
// get the list of scopes that this target is valid for. Org, User, Global // get the list of scopes that this target is valid for. Org, User, Global
scopes, err := m.GetQuotaScopes(target) scopes, err := m.GetQuotaScopes(target)
if err != nil { if err != nil {
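
With the nil guard in place, background services (such as LDAP background sync) can call the quota check without a request context and simply skip enforcement instead of dereferencing a nil pointer. A minimal sketch of the guard with simplified types; the scope resolution is elided:

```go
package main

import "fmt"

// Stand-in for models.ReqContext so the sketch compiles on its own.
type ReqContext struct{ OrgId int64 }

var quotaEnabled = true

// quotaReached mirrors the early returns of QuotaService.QuotaReached.
func quotaReached(c *ReqContext, target string) (bool, error) {
	if !quotaEnabled {
		return false, nil
	}
	// No request context means this is a background service, like LDAP background sync:
	// skip quota enforcement rather than failing.
	if c == nil {
		return false, nil
	}
	// ... the real implementation resolves the Org/User/Global scopes and checks usage ...
	return false, nil
}

func main() {
	// Background job: no request context available.
	reached, err := quotaReached(nil, "user")
	fmt.Println(reached, err) // false <nil>
}
```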

View File

@@ -20,6 +20,8 @@ func init() {
registry.RegisterService(&RenderingService{}) registry.RegisterService(&RenderingService{})
} }
var IsPhantomJSEnabled = false
type RenderingService struct { type RenderingService struct {
log log.Logger log log.Logger
pluginInfo *plugins.RendererPlugin pluginInfo *plugins.RendererPlugin
@@ -68,6 +70,7 @@ func (rs *RenderingService) Run(ctx context.Context) error {
rs.log.Warn("phantomJS is deprecated and will be removed in a future release. " + rs.log.Warn("phantomJS is deprecated and will be removed in a future release. " +
"You should consider migrating from phantomJS to grafana-image-renderer plugin.") "You should consider migrating from phantomJS to grafana-image-renderer plugin.")
rs.renderAction = rs.renderViaPhantomJS rs.renderAction = rs.renderViaPhantomJS
IsPhantomJSEnabled = true
<-ctx.Done() <-ctx.Done()
return nil return nil
} }
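
The package-level flag lets other code check which renderer is active, so PhantomJS-specific behaviour (such as the `body--phantomjs` CSS class mentioned in the related fix) is only applied when PhantomJS is actually selected. A small sketch of such a consumer; the `bodyClass` helper is illustrative, not the real template code:

```go
package main

import "fmt"

// IsPhantomJSEnabled mirrors the package-level flag set in RenderingService.Run
// when the PhantomJS fallback is chosen instead of the image renderer plugin.
var IsPhantomJSEnabled = false

// bodyClass only emits the PhantomJS-specific CSS class when PhantomJS is in use.
func bodyClass() string {
	if IsPhantomJSEnabled {
		return "body--phantomjs"
	}
	return ""
}

func main() {
	fmt.Printf("class=%q\n", bodyClass()) // class="" when the plugin/service renderer is used
	IsPhantomJSEnabled = true
	fmt.Printf("class=%q\n", bodyClass()) // class="body--phantomjs"
}
```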

View File

@@ -144,46 +144,48 @@ func (r *SqlAnnotationRepo) Find(query *annotations.ItemQuery) ([]*annotations.I
FROM annotation FROM annotation
LEFT OUTER JOIN ` + dialect.Quote("user") + ` as usr on usr.id = annotation.user_id LEFT OUTER JOIN ` + dialect.Quote("user") + ` as usr on usr.id = annotation.user_id
LEFT OUTER JOIN alert on alert.id = annotation.alert_id LEFT OUTER JOIN alert on alert.id = annotation.alert_id
INNER JOIN (
SELECT a.id from annotation a
`) `)
sql.WriteString(`WHERE annotation.org_id = ?`) sql.WriteString(`WHERE a.org_id = ?`)
params = append(params, query.OrgId) params = append(params, query.OrgId)
if query.AnnotationId != 0 { if query.AnnotationId != 0 {
// fmt.Print("annotation query") // fmt.Print("annotation query")
sql.WriteString(` AND annotation.id = ?`) sql.WriteString(` AND a.id = ?`)
params = append(params, query.AnnotationId) params = append(params, query.AnnotationId)
} }
if query.AlertId != 0 { if query.AlertId != 0 {
sql.WriteString(` AND annotation.alert_id = ?`) sql.WriteString(` AND a.alert_id = ?`)
params = append(params, query.AlertId) params = append(params, query.AlertId)
} }
if query.DashboardId != 0 { if query.DashboardId != 0 {
sql.WriteString(` AND annotation.dashboard_id = ?`) sql.WriteString(` AND a.dashboard_id = ?`)
params = append(params, query.DashboardId) params = append(params, query.DashboardId)
} }
if query.PanelId != 0 { if query.PanelId != 0 {
sql.WriteString(` AND annotation.panel_id = ?`) sql.WriteString(` AND a.panel_id = ?`)
params = append(params, query.PanelId) params = append(params, query.PanelId)
} }
if query.UserId != 0 { if query.UserId != 0 {
sql.WriteString(` AND annotation.user_id = ?`) sql.WriteString(` AND a.user_id = ?`)
params = append(params, query.UserId) params = append(params, query.UserId)
} }
if query.From > 0 && query.To > 0 { if query.From > 0 && query.To > 0 {
sql.WriteString(` AND annotation.epoch <= ? AND annotation.epoch_end >= ?`) sql.WriteString(` AND a.epoch <= ? AND a.epoch_end >= ?`)
params = append(params, query.To, query.From) params = append(params, query.To, query.From)
} }
if query.Type == "alert" { if query.Type == "alert" {
sql.WriteString(` AND annotation.alert_id > 0`) sql.WriteString(` AND a.alert_id > 0`)
} else if query.Type == "annotation" { } else if query.Type == "annotation" {
sql.WriteString(` AND annotation.alert_id = 0`) sql.WriteString(` AND a.alert_id = 0`)
} }
if len(query.Tags) > 0 { if len(query.Tags) > 0 {
@@ -204,7 +206,7 @@ func (r *SqlAnnotationRepo) Find(query *annotations.ItemQuery) ([]*annotations.I
tagsSubQuery := fmt.Sprintf(` tagsSubQuery := fmt.Sprintf(`
SELECT SUM(1) FROM annotation_tag at SELECT SUM(1) FROM annotation_tag at
INNER JOIN tag on tag.id = at.tag_id INNER JOIN tag on tag.id = at.tag_id
WHERE at.annotation_id = annotation.id WHERE at.annotation_id = a.id
AND ( AND (
%s %s
) )
@@ -223,7 +225,8 @@ func (r *SqlAnnotationRepo) Find(query *annotations.ItemQuery) ([]*annotations.I
query.Limit = 100 query.Limit = 100
} }
sql.WriteString(" ORDER BY epoch DESC" + dialect.Limit(query.Limit)) // order of ORDER BY arguments match the order of a sql index for performance
sql.WriteString(" ORDER BY a.org_id, a.epoch_end DESC, a.epoch DESC" + dialect.Limit(query.Limit) + " ) dt on dt.id = annotation.id")
items := make([]*annotations.ItemDTO, 0) items := make([]*annotations.ItemDTO, 0)
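
Put together, the rewritten query filters and limits inside an inner subquery whose ORDER BY matches the new (org_id, epoch_end, epoch) index, and only then joins the matching ids back to the annotation table for the full row data. A simplified sketch of the final shape (the select list, dialect-specific quoting/limit and the optional filters are abbreviated):

```go
package main

import "fmt"

// annotationFindSQL is a simplified version of the SQL produced by
// SqlAnnotationRepo.Find after the rewrite; the real code appends optional
// filters (dashboard, panel, alert, user, type, tags) inside the inner
// subquery before the ORDER BY, and uses dialect.Quote/dialect.Limit.
const annotationFindSQL = `
SELECT annotation.*, usr.email, usr.login, alert.name AS alert_name
FROM annotation
LEFT OUTER JOIN "user" AS usr ON usr.id = annotation.user_id
LEFT OUTER JOIN alert ON alert.id = annotation.alert_id
INNER JOIN (
    SELECT a.id FROM annotation a
    WHERE a.org_id = ?
      AND a.epoch <= ? AND a.epoch_end >= ?
    -- ORDER BY columns match the org_id_epoch_end_epoch index added by the migration
    ORDER BY a.org_id, a.epoch_end DESC, a.epoch DESC
    LIMIT 100
) dt ON dt.id = annotation.id`

func main() {
	fmt.Println(annotationFindSQL)
}
```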

View File

@@ -123,7 +123,28 @@ func addAnnotationMig(mg *Migrator) {
mg.AddMigration("Make epoch_end the same as epoch", NewRawSqlMigration("UPDATE annotation SET epoch_end = epoch")) mg.AddMigration("Make epoch_end the same as epoch", NewRawSqlMigration("UPDATE annotation SET epoch_end = epoch"))
mg.AddMigration("Move region to single row", &AddMakeRegionSingleRowMigration{}) mg.AddMigration("Move region to single row", &AddMakeRegionSingleRowMigration{})
// TODO! drop region_id column? //
// 6.6.1: Optimize annotation queries
//
mg.AddMigration("Remove index org_id_epoch from annotation table", NewDropIndexMigration(table, &Index{
Cols: []string{"org_id", "epoch"}, Type: IndexType,
}))
mg.AddMigration("Remove index org_id_dashboard_id_panel_id_epoch from annotation table", NewDropIndexMigration(table, &Index{
Cols: []string{"org_id", "dashboard_id", "panel_id", "epoch"}, Type: IndexType,
}))
mg.AddMigration("Add index for org_id_dashboard_id_epoch_end_epoch on annotation table", NewAddIndexMigration(table, &Index{
Cols: []string{"org_id", "dashboard_id", "epoch_end", "epoch"}, Type: IndexType,
}))
mg.AddMigration("Add index for org_id_epoch_end_epoch on annotation table", NewAddIndexMigration(table, &Index{
Cols: []string{"org_id", "epoch_end", "epoch"}, Type: IndexType,
}))
mg.AddMigration("Remove index org_id_epoch_epoch_end from annotation table", NewDropIndexMigration(table, &Index{
Cols: []string{"org_id", "epoch", "epoch_end"}, Type: IndexType,
}))
} }
type AddMakeRegionSingleRowMigration struct { type AddMakeRegionSingleRowMigration struct {

View File

@@ -6,6 +6,7 @@ import (
"strings" "strings"
"github.com/go-xorm/xorm" "github.com/go-xorm/xorm"
"github.com/grafana/grafana/pkg/util/errutil"
"github.com/lib/pq" "github.com/lib/pq"
) )
@@ -155,3 +156,15 @@ func (db *Postgres) IsUniqueConstraintViolation(err error) bool {
func (db *Postgres) IsDeadlock(err error) bool { func (db *Postgres) IsDeadlock(err error) bool {
return db.isThisError(err, "40P01") return db.isThisError(err, "40P01")
} }
func (db *Postgres) PostInsertId(table string, sess *xorm.Session) error {
if table != "org" {
return nil
}
// sync primary key sequence of org table
if _, err := sess.Exec("SELECT setval('org_id_seq', (SELECT max(id) FROM org));"); err != nil {
return errutil.Wrapf(err, "failed to sync primary key for org table")
}
return nil
}
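
The hook is needed because org rows can be inserted with explicit ids, which leaves the `org_id_seq` sequence behind and makes the next auto-generated insert collide. A minimal sketch of the same idea against a plain database/sql connection; the connection string is illustrative and the surrounding xorm session is omitted:

```go
package main

import (
	"database/sql"
	"fmt"

	_ "github.com/lib/pq" // Postgres driver, already a Grafana dependency
)

// syncOrgSequence mirrors PostInsertId: after an insert into the org table,
// bump org_id_seq to max(id) so later auto-generated ids do not collide.
func syncOrgSequence(db *sql.DB, table string) error {
	if table != "org" {
		return nil
	}
	if _, err := db.Exec("SELECT setval('org_id_seq', (SELECT max(id) FROM org));"); err != nil {
		return fmt.Errorf("failed to sync primary key for org table: %w", err)
	}
	return nil
}

func main() {
	// Connection string is illustrative only.
	db, err := sql.Open("postgres", "postgres://grafana:grafana@localhost/grafana?sslmode=disable")
	if err != nil {
		panic(err)
	}
	defer db.Close()
	fmt.Println(syncOrgSequence(db, "org"))
}
```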

View File

@@ -40,7 +40,6 @@ const (
PROD = "production" PROD = "production"
TEST = "test" TEST = "test"
APP_NAME = "Grafana" APP_NAME = "Grafana"
APP_NAME_ENTERPRISE = "Grafana Enterprise"
) )
var ( var (
@@ -619,9 +618,6 @@ func (cfg *Cfg) Load(args *CommandLineArgs) error {
Raw = cfg.Raw Raw = cfg.Raw
ApplicationName = APP_NAME ApplicationName = APP_NAME
if IsEnterprise {
ApplicationName = APP_NAME_ENTERPRISE
}
Env, err = valueAsString(iniFile.Section(""), "app_mode", "development") Env, err = valueAsString(iniFile.Section(""), "app_mode", "development")
if err != nil { if err != nil {

View File

@@ -58,6 +58,7 @@ func init() {
"AWS/DMS": {"CDCChangesDiskSource", "CDCChangesDiskTarget", "CDCChangesMemorySource", "CDCChangesMemoryTarget", "CDCIncomingChanges", "CDCLatencySource", "CDCLatencyTarget", "CDCThroughputBandwidthSource", "CDCThroughputBandwidthTarget", "CDCThroughputRowsSource", "CDCThroughputRowsTarget", "CPUUtilization", "FreeStorageSpace", "FreeableMemory", "FullLoadThroughputBandwidthSource", "FullLoadThroughputBandwidthTarget", "FullLoadThroughputRowsSource", "FullLoadThroughputRowsTarget", "NetworkReceiveThroughput", "NetworkTransmitThroughput", "ReadIOPS", "ReadLatency", "ReadThroughput", "SwapUsage", "WriteIOPS", "WriteLatency", "WriteThroughput"}, "AWS/DMS": {"CDCChangesDiskSource", "CDCChangesDiskTarget", "CDCChangesMemorySource", "CDCChangesMemoryTarget", "CDCIncomingChanges", "CDCLatencySource", "CDCLatencyTarget", "CDCThroughputBandwidthSource", "CDCThroughputBandwidthTarget", "CDCThroughputRowsSource", "CDCThroughputRowsTarget", "CPUUtilization", "FreeStorageSpace", "FreeableMemory", "FullLoadThroughputBandwidthSource", "FullLoadThroughputBandwidthTarget", "FullLoadThroughputRowsSource", "FullLoadThroughputRowsTarget", "NetworkReceiveThroughput", "NetworkTransmitThroughput", "ReadIOPS", "ReadLatency", "ReadThroughput", "SwapUsage", "WriteIOPS", "WriteLatency", "WriteThroughput"},
"AWS/DocDB": {"BackupRetentionPeriodStorageUsed", "BufferCacheHitRatio", "CPUUtilization", "DatabaseConnections", "DBInstanceReplicaLag", "DBClusterReplicaLagMaximum", "DBClusterReplicaLagMinimum", "DiskQueueDepth", "EngineUptime", "FreeableMemory", "FreeLocalStorage", "NetworkReceiveThroughput", "NetworkThroughput", "NetworkTransmitThroughput", "ReadIOPS", "ReadLatency", "ReadThroughput", "SnapshotStorageUsed", "SwapUsage", "TotalBackupStorageBilled", "VolumeBytesUsed", "VolumeReadIOPs", "VolumeWriteIOPs", "WriteIOPS", "WriteLatency", "WriteThroughput"}, "AWS/DocDB": {"BackupRetentionPeriodStorageUsed", "BufferCacheHitRatio", "CPUUtilization", "DatabaseConnections", "DBInstanceReplicaLag", "DBClusterReplicaLagMaximum", "DBClusterReplicaLagMinimum", "DiskQueueDepth", "EngineUptime", "FreeableMemory", "FreeLocalStorage", "NetworkReceiveThroughput", "NetworkThroughput", "NetworkTransmitThroughput", "ReadIOPS", "ReadLatency", "ReadThroughput", "SnapshotStorageUsed", "SwapUsage", "TotalBackupStorageBilled", "VolumeBytesUsed", "VolumeReadIOPs", "VolumeWriteIOPs", "WriteIOPS", "WriteLatency", "WriteThroughput"},
"AWS/DX": {"ConnectionBpsEgress", "ConnectionBpsIngress", "ConnectionCRCErrorCount", "ConnectionLightLevelRx", "ConnectionLightLevelTx", "ConnectionPpsEgress", "ConnectionPpsIngress", "ConnectionState"}, "AWS/DX": {"ConnectionBpsEgress", "ConnectionBpsIngress", "ConnectionCRCErrorCount", "ConnectionLightLevelRx", "ConnectionLightLevelTx", "ConnectionPpsEgress", "ConnectionPpsIngress", "ConnectionState"},
"AWS/DAX": {"CPUUtilization", "NetworkPacketsIn", "NetworkPacketsOut", "GetItemRequestCount", "BatchGetItemRequestCount", "BatchWriteItemRequestCount", "DeleteItemRequestCount", "PutItemRequestCount", "UpdateItemRequestCount", "TransactWriteItemsCount", "TransactGetItemsCount", "ItemCacheHits", "ItemCacheMisses", "QueryCacheHits", "QueryCacheMisses", "ScanCacheHits", "ScanCacheMisses", "TotalRequestCount", "ErrorRequestCount", "FaultRequestCount", "FailedRequestCount", "QueryRequestCount", "ScanRequestCount", "ClientConnections", "EstimatedDbSize", "EvictedSize"},
"AWS/DynamoDB": {"ConditionalCheckFailedRequests", "ConsumedReadCapacityUnits", "ConsumedWriteCapacityUnits", "OnlineIndexConsumedWriteCapacity", "OnlineIndexPercentageProgress", "OnlineIndexThrottleEvents", "PendingReplicationCount", "ProvisionedReadCapacityUnits", "ProvisionedWriteCapacityUnits", "ReadThrottleEvents", "ReplicationLatency", "ReturnedBytes", "ReturnedItemCount", "ReturnedRecordsCount", "SuccessfulRequestLatency", "SystemErrors", "ThrottledRequests", "TimeToLiveDeletedItemCount", "UserErrors", "WriteThrottleEvents"}, "AWS/DynamoDB": {"ConditionalCheckFailedRequests", "ConsumedReadCapacityUnits", "ConsumedWriteCapacityUnits", "OnlineIndexConsumedWriteCapacity", "OnlineIndexPercentageProgress", "OnlineIndexThrottleEvents", "PendingReplicationCount", "ProvisionedReadCapacityUnits", "ProvisionedWriteCapacityUnits", "ReadThrottleEvents", "ReplicationLatency", "ReturnedBytes", "ReturnedItemCount", "ReturnedRecordsCount", "SuccessfulRequestLatency", "SystemErrors", "ThrottledRequests", "TimeToLiveDeletedItemCount", "UserErrors", "WriteThrottleEvents"},
"AWS/EBS": {"BurstBalance", "VolumeConsumedReadWriteOps", "VolumeIdleTime", "VolumeQueueLength", "VolumeReadBytes", "VolumeReadOps", "VolumeThroughputPercentage", "VolumeTotalReadTime", "VolumeTotalWriteTime", "VolumeWriteBytes", "VolumeWriteOps"}, "AWS/EBS": {"BurstBalance", "VolumeConsumedReadWriteOps", "VolumeIdleTime", "VolumeQueueLength", "VolumeReadBytes", "VolumeReadOps", "VolumeThroughputPercentage", "VolumeTotalReadTime", "VolumeTotalWriteTime", "VolumeWriteBytes", "VolumeWriteOps"},
"AWS/EC2": {"CPUCreditBalance", "CPUCreditUsage", "CPUSurplusCreditBalance", "CPUSurplusCreditsCharged", "CPUUtilization", "DiskReadBytes", "DiskReadOps", "DiskWriteBytes", "DiskWriteOps", "EBSByteBalance%", "EBSIOBalance%", "EBSReadBytes", "EBSReadOps", "EBSWriteBytes", "EBSWriteOps", "NetworkIn", "NetworkOut", "NetworkPacketsIn", "NetworkPacketsOut", "StatusCheckFailed", "StatusCheckFailed_Instance", "StatusCheckFailed_System"}, "AWS/EC2": {"CPUCreditBalance", "CPUCreditUsage", "CPUSurplusCreditBalance", "CPUSurplusCreditsCharged", "CPUUtilization", "DiskReadBytes", "DiskReadOps", "DiskWriteBytes", "DiskWriteOps", "EBSByteBalance%", "EBSIOBalance%", "EBSReadBytes", "EBSReadOps", "EBSWriteBytes", "EBSWriteOps", "NetworkIn", "NetworkOut", "NetworkPacketsIn", "NetworkPacketsOut", "StatusCheckFailed", "StatusCheckFailed_Instance", "StatusCheckFailed_System"},
@@ -140,6 +141,7 @@ func init() {
"AWS/DMS": {"ReplicationInstanceIdentifier", "ReplicationTaskIdentifier"}, "AWS/DMS": {"ReplicationInstanceIdentifier", "ReplicationTaskIdentifier"},
"AWS/DocDB": {"DBClusterIdentifier"}, "AWS/DocDB": {"DBClusterIdentifier"},
"AWS/DX": {"ConnectionId"}, "AWS/DX": {"ConnectionId"},
"AWS/DAX": {"Account", "ClusterId", "NodeId"},
"AWS/DynamoDB": {"GlobalSecondaryIndexName", "Operation", "ReceivingRegion", "StreamLabel", "TableName"}, "AWS/DynamoDB": {"GlobalSecondaryIndexName", "Operation", "ReceivingRegion", "StreamLabel", "TableName"},
"AWS/EBS": {"VolumeId"}, "AWS/EBS": {"VolumeId"},
"AWS/EC2": {"AutoScalingGroupName", "ImageId", "InstanceId", "InstanceType"}, "AWS/EC2": {"AutoScalingGroupName", "ImageId", "InstanceId", "InstanceType"},

View File

@@ -100,7 +100,7 @@ func (e *CloudWatchExecutor) transformQueryResponseToQueryResult(cloudwatchRespo
} }
if partialData { if partialData {
queryResult.ErrorString = "Cloudwatch GetMetricData error: Too many datapoints requested - your search have been limited. Please try to reduce the time range" queryResult.ErrorString = "Cloudwatch GetMetricData error: Too many datapoints requested - your search has been limited. Please try to reduce the time range"
} }
queryResult.Series = append(queryResult.Series, timeSeries...) queryResult.Series = append(queryResult.Series, timeSeries...)

View File

@@ -68,8 +68,15 @@ func parseRequestQuery(model *simplejson.Json, refId string, startTime time.Time
var period int var period int
if strings.ToLower(p) == "auto" || p == "" { if strings.ToLower(p) == "auto" || p == "" {
deltaInSeconds := endTime.Sub(startTime).Seconds() deltaInSeconds := endTime.Sub(startTime).Seconds()
periods := []int{60, 300, 900, 3600, 21600} periods := []int{60, 300, 900, 3600, 21600, 86400}
period = closest(periods, int(math.Ceil(deltaInSeconds/2000))) datapoints := int(math.Ceil(deltaInSeconds / 2000))
period = periods[len(periods)-1]
for _, value := range periods {
if datapoints <= value {
period = value
break
}
}
} else { } else {
if regexp.MustCompile(`^\d+$`).Match([]byte(p)) { if regexp.MustCompile(`^\d+$`).Match([]byte(p)) {
period, err = strconv.Atoi(p) period, err = strconv.Atoi(p)
@@ -158,25 +165,3 @@ func sortDimensions(dimensions map[string][]string) map[string][]string {
} }
return sortedDimensions return sortedDimensions
} }
func closest(array []int, num int) int {
minDiff := array[len(array)-1]
var closest int
if num <= array[0] {
return array[0]
}
if num >= array[len(array)-1] {
return array[len(array)-1]
}
for _, value := range array {
var m = int(math.Abs(float64(num - value)))
if m <= minDiff {
minDiff = m
closest = value
}
}
return closest
}
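
The new auto-period logic targets roughly 2000 datapoints: it computes the per-datapoint interval for the requested range and picks the first allowed period that is at least that large, falling back to one day (86400 s) for very long ranges. A small worked sketch of just that selection, matching the values asserted in the updated tests:

```go
package main

import (
	"fmt"
	"math"
	"time"
)

// autoPeriod mirrors the new "auto" branch of parseRequestQuery: aim for ~2000
// datapoints and round up to the next allowed CloudWatch period.
func autoPeriod(timeRange time.Duration) int {
	periods := []int{60, 300, 900, 3600, 21600, 86400}
	datapoints := int(math.Ceil(timeRange.Seconds() / 2000))

	period := periods[len(periods)-1] // default to 1 day for very long ranges
	for _, value := range periods {
		if datapoints <= value {
			period = value
			break
		}
	}
	return period
}

func main() {
	for _, r := range []time.Duration{
		5 * time.Minute,          // -> 60
		24 * time.Hour,           // -> 60    (86400/2000 ≈ 44 s, rounded up to 60)
		2 * 24 * time.Hour,       // -> 300
		7 * 24 * time.Hour,       // -> 900
		30 * 24 * time.Hour,      // -> 3600
		90 * 24 * time.Hour,      // -> 21600
		2 * 365 * 24 * time.Hour, // -> 86400
	} {
		fmt.Printf("%v -> %d s\n", r, autoPeriod(r))
	}
}
```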

View File

@@ -2,6 +2,7 @@ package cloudwatch
import ( import (
"testing" "testing"
"time"
"github.com/grafana/grafana/pkg/components/simplejson" "github.com/grafana/grafana/pkg/components/simplejson"
"github.com/grafana/grafana/pkg/tsdb" "github.com/grafana/grafana/pkg/tsdb"
@@ -127,66 +128,86 @@ func TestRequestParser(t *testing.T) {
"period": "auto", "period": "auto",
}) })
Convey("when time range is short", func() { Convey("when time range is 5 minutes", func() {
query.Set("period", "auto") query.Set("period", "auto")
timeRange := tsdb.NewTimeRange("now-2h", "now-1h") to := time.Now()
from, _ := timeRange.ParseFrom() from := to.Local().Add(time.Minute * time.Duration(5))
to, _ := timeRange.ParseTo()
res, err := parseRequestQuery(query, "ref1", from, to) res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil) So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 60) So(res.Period, ShouldEqual, 60)
}) })
Convey("when time range is 5y", func() { Convey("when time range is 1 day", func() {
timeRange := tsdb.NewTimeRange("now-5y", "now") query.Set("period", "auto")
from, _ := timeRange.ParseFrom() to := time.Now()
to, _ := timeRange.ParseTo() from := to.AddDate(0, 0, -1)
res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 60)
})
Convey("when time range is 2 days", func() {
query.Set("period", "auto")
to := time.Now()
from := to.AddDate(0, 0, -2)
res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 300)
})
Convey("when time range is 7 days", func() {
query.Set("period", "auto")
to := time.Now()
from := to.AddDate(0, 0, -7)
res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 900)
})
Convey("when time range is 30 days", func() {
query.Set("period", "auto")
to := time.Now()
from := to.AddDate(0, 0, -30)
res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 3600)
})
Convey("when time range is 90 days", func() {
query.Set("period", "auto")
to := time.Now()
from := to.AddDate(0, 0, -90)
res, err := parseRequestQuery(query, "ref1", from, to) res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil) So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 21600) So(res.Period, ShouldEqual, 21600)
}) })
Convey("when time range is 1 year", func() {
query.Set("period", "auto")
to := time.Now()
from := to.AddDate(-1, 0, 0)
res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 21600)
}) })
Convey("closest works as expected", func() { Convey("when time range is 2 years", func() {
periods := []int{60, 300, 900, 3600, 21600} query.Set("period", "auto")
Convey("and input is lower than 60", func() { to := time.Now()
So(closest(periods, 6), ShouldEqual, 60) from := to.AddDate(-2, 0, 0)
res, err := parseRequestQuery(query, "ref1", from, to)
So(err, ShouldBeNil)
So(res.Period, ShouldEqual, 86400)
})
}) })
Convey("and input is exactly 60", func() {
So(closest(periods, 60), ShouldEqual, 60)
})
Convey("and input is exactly between two steps", func() {
So(closest(periods, 180), ShouldEqual, 300)
})
Convey("and input is exactly 2000", func() {
So(closest(periods, 2000), ShouldEqual, 900)
})
Convey("and input is exactly 5000", func() {
So(closest(periods, 5000), ShouldEqual, 3600)
})
Convey("and input is exactly 50000", func() {
So(closest(periods, 50000), ShouldEqual, 21600)
})
Convey("and period isn't shorter than min retension for 15 days", func() {
So(closest(periods, (60*60*24*15)+1/2000), ShouldBeGreaterThanOrEqualTo, 300)
})
Convey("and period isn't shorter than min retension for 63 days", func() {
So(closest(periods, (60*60*24*63)+1/2000), ShouldBeGreaterThanOrEqualTo, 3600)
})
Convey("and period isn't shorter than min retension for 455 days", func() {
So(closest(periods, (60*60*24*455)+1/2000), ShouldBeGreaterThanOrEqualTo, 21600)
})
})
}) })
}) })
} }

View File

@@ -100,8 +100,10 @@ func parseGetMetricDataTimeSeries(metricDataResults map[string]*cloudwatch.Metri
series.Tags[key] = values[0] series.Tags[key] = values[0]
} else { } else {
for _, value := range values { for _, value := range values {
if value == label || value == "*" || strings.Contains(label, value) { if value == label || value == "*" {
series.Tags[key] = label series.Tags[key] = label
} else if strings.Contains(label, value) {
series.Tags[key] = value
} }
} }
} }
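
The effect of the reordered checks: when a query lists several values for a dimension, each expanded series is tagged with the specific value contained in its label rather than the whole label. A tiny sketch of the matching rule with local types; not the actual response parser:

```go
package main

import (
	"fmt"
	"strings"
)

// dimensionTag mirrors the new matching order: an exact match or wildcard keeps
// the full label as the tag, while a substring match tags the series with the
// matching dimension value itself (so "lb1 Expanded" gets LoadBalancer=lb1).
func dimensionTag(label string, values []string) string {
	tag := ""
	for _, value := range values {
		if value == label || value == "*" {
			tag = label
		} else if strings.Contains(label, value) {
			tag = value
		}
	}
	return tag
}

func main() {
	values := []string{"lb1", "lb2"}
	for _, label := range []string{"lb1 Expanded", "lb2 Expanded"} {
		fmt.Printf("%s -> LoadBalancer=%s\n", label, dimensionTag(label, values))
	}
}
```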

View File

@@ -53,7 +53,7 @@ func TestCloudWatchResponseParser(t *testing.T) {
Namespace: "AWS/ApplicationELB", Namespace: "AWS/ApplicationELB",
MetricName: "TargetResponseTime", MetricName: "TargetResponseTime",
Dimensions: map[string][]string{ Dimensions: map[string][]string{
"LoadBalancer": {"lb2"}, "LoadBalancer": {"lb1", "lb2"},
"TargetGroup": {"tg"}, "TargetGroup": {"tg"},
}, },
Stats: "Average", Stats: "Average",
@@ -65,8 +65,12 @@ func TestCloudWatchResponseParser(t *testing.T) {
So(err, ShouldBeNil) So(err, ShouldBeNil)
So(partialData, ShouldBeFalse) So(partialData, ShouldBeFalse)
So(timeSeries.Name, ShouldEqual, "lb2 Expanded") So(timeSeries.Name, ShouldEqual, "lb1 Expanded")
So(timeSeries.Tags["LoadBalancer"], ShouldEqual, "lb2") So(timeSeries.Tags["LoadBalancer"], ShouldEqual, "lb1")
timeSeries2 := (*series)[1]
So(timeSeries2.Name, ShouldEqual, "lb2 Expanded")
So(timeSeries2.Tags["LoadBalancer"], ShouldEqual, "lb2")
}) })
Convey("can expand dimension value using substring", func() { Convey("can expand dimension value using substring", func() {
@@ -110,7 +114,7 @@ func TestCloudWatchResponseParser(t *testing.T) {
Namespace: "AWS/ApplicationELB", Namespace: "AWS/ApplicationELB",
MetricName: "TargetResponseTime", MetricName: "TargetResponseTime",
Dimensions: map[string][]string{ Dimensions: map[string][]string{
"LoadBalancer": {"lb1"}, "LoadBalancer": {"lb1", "lb2"},
"TargetGroup": {"tg"}, "TargetGroup": {"tg"},
}, },
Stats: "Average", Stats: "Average",
@@ -119,11 +123,14 @@ func TestCloudWatchResponseParser(t *testing.T) {
} }
series, partialData, err := parseGetMetricDataTimeSeries(resp, query) series, partialData, err := parseGetMetricDataTimeSeries(resp, query)
timeSeries := (*series)[0] timeSeries := (*series)[0]
So(err, ShouldBeNil) So(err, ShouldBeNil)
So(partialData, ShouldBeFalse) So(partialData, ShouldBeFalse)
So(timeSeries.Name, ShouldEqual, "lb1 Expanded") So(timeSeries.Name, ShouldEqual, "lb1 Expanded")
So(timeSeries.Tags["LoadBalancer"], ShouldEqual, "lb1") So(timeSeries.Tags["LoadBalancer"], ShouldEqual, "lb1")
timeSeries2 := (*series)[1]
So(timeSeries2.Name, ShouldEqual, "lb2 Expanded")
So(timeSeries2.Tags["LoadBalancer"], ShouldEqual, "lb2")
}) })
Convey("can expand dimension value using wildcard", func() { Convey("can expand dimension value using wildcard", func() {

View File

@@ -542,8 +542,11 @@ func getRandomWalk(query *tsdb.Query, tsdbQuery *tsdb.TsdbQuery, index int) *tsd
startValue := query.Model.Get("startValue").MustFloat64(rand.Float64() * 100) startValue := query.Model.Get("startValue").MustFloat64(rand.Float64() * 100)
spread := query.Model.Get("spread").MustFloat64(1) spread := query.Model.Get("spread").MustFloat64(1)
noise := query.Model.Get("noise").MustFloat64(0) noise := query.Model.Get("noise").MustFloat64(0)
min, hasMin := query.Model.Get("min").Float64()
max, hasMax := query.Model.Get("max").Float64() min, err := query.Model.Get("min").Float64()
hasMin := err == nil
max, err := query.Model.Get("max").Float64()
hasMax := err == nil
points := make(tsdb.TimeSeriesPoints, 0) points := make(tsdb.TimeSeriesPoints, 0)
walker := startValue walker := startValue
@@ -551,12 +554,12 @@ func getRandomWalk(query *tsdb.Query, tsdbQuery *tsdb.TsdbQuery, index int) *tsd
for i := int64(0); i < 10000 && timeWalkerMs < to; i++ { for i := int64(0); i < 10000 && timeWalkerMs < to; i++ {
nextValue := walker + (rand.Float64() * noise) nextValue := walker + (rand.Float64() * noise)
if hasMin == nil && nextValue < min { if hasMin && nextValue < min {
nextValue = min nextValue = min
walker = min walker = min
} }
if hasMax == nil && nextValue > max { if hasMax && nextValue > max {
nextValue = max nextValue = max
walker = max walker = max
} }

View File

@@ -1,4 +1,5 @@
import '@babel/polyfill'; import '@babel/polyfill';
import 'url-search-params-polyfill'; // URLSearchParams polyfill, needed for PhantomJS rendering
import 'file-saver'; import 'file-saver';
import 'lodash'; import 'lodash';
import 'jquery'; import 'jquery';

View File

@@ -7,9 +7,13 @@ export interface BrandComponentProps {
} }
export const LoginLogo: FC<BrandComponentProps> = ({ className }) => { export const LoginLogo: FC<BrandComponentProps> = ({ className }) => {
const maxSize = css`
max-width: 150px;
`;
return ( return (
<> <>
<img className={className} src="public/img/grafana_icon.svg" alt="Grafana" /> <img className={cx(className, maxSize)} src="public/img/grafana_icon.svg" alt="Grafana" />
<div className="logo-wordmark" /> <div className="logo-wordmark" />
</> </>
); );

View File

@@ -5,7 +5,7 @@ export interface FooterLink {
text: string; text: string;
icon?: string; icon?: string;
url?: string; url?: string;
target: string; target?: string;
} }
export let getFooterLinks = (): FooterLink[] => { export let getFooterLinks = (): FooterLink[] => {
@@ -17,7 +17,7 @@ export let getFooterLinks = (): FooterLink[] => {
target: '_blank', target: '_blank',
}, },
{ {
text: 'Support & Enterprise', text: 'Support',
icon: 'fa fa-support', icon: 'fa fa-support',
url: 'https://grafana.com/products/enterprise/?utm_source=grafana_footer', url: 'https://grafana.com/products/enterprise/?utm_source=grafana_footer',
target: '_blank', target: '_blank',
@@ -32,15 +32,12 @@ export let getFooterLinks = (): FooterLink[] => {
}; };
export let getVersionLinks = (): FooterLink[] => { export let getVersionLinks = (): FooterLink[] => {
const { buildInfo } = config; const { buildInfo, licenseInfo } = config;
const links: FooterLink[] = [];
const stateInfo = licenseInfo.stateInfo ? ` (${licenseInfo.stateInfo})` : '';
const links: FooterLink[] = [ links.push({ text: `${buildInfo.edition}${stateInfo}`, url: licenseInfo.licenseUrl });
{ links.push({ text: `v${buildInfo.version} (${buildInfo.commit})` });
text: `Grafana v${buildInfo.version} (commit: ${buildInfo.commit})`,
url: 'https://grafana.com',
target: '_blank',
},
];
if (buildInfo.hasUpdate) { if (buildInfo.hasUpdate) {
links.push({ links.push({

View File

@@ -103,7 +103,7 @@ export class LoginForm extends PureComponent<Props, State> {
Log In Log In
</button> </button>
) : ( ) : (
<button type="submit" className="btn btn-large p-x-2 btn-inverse btn-loading"> <button type="submit" disabled className="btn btn-large p-x-2 btn-inverse btn-loading">
Logging In<span>.</span> Logging In<span>.</span>
<span>.</span> <span>.</span>
<span>.</span> <span>.</span>

View File

@@ -9,13 +9,14 @@ import LoginCtrl from './LoginCtrl';
import { LoginForm } from './LoginForm'; import { LoginForm } from './LoginForm';
import { ChangePassword } from './ChangePassword'; import { ChangePassword } from './ChangePassword';
import { Branding } from 'app/core/components/Branding/Branding'; import { Branding } from 'app/core/components/Branding/Branding';
import { Footer } from 'app/core/components/Footer/Footer';
export const LoginPage: FC = () => { export const LoginPage: FC = () => {
return ( return (
<Branding.LoginBackground className="login container"> <Branding.LoginBackground className="login container">
<div className="login-content"> <div className="login-content">
<div className="login-branding"> <div className="login-branding">
<Branding.LoginLogo className="logo-icon" /> <Branding.LoginLogo className="login-logo" />
</div> </div>
<LoginCtrl> <LoginCtrl>
{({ {({
@@ -62,6 +63,7 @@ export const LoginPage: FC = () => {
<div className="clearfix" /> <div className="clearfix" />
</div> </div>
<Footer />
</Branding.LoginBackground> </Branding.LoginBackground>
); );
}; };

View File

@@ -60,6 +60,7 @@ const Navigation = ({ main }: { main: NavModelItem }) => {
<TabsBar className="page-header__tabs" hideBorder={true}> <TabsBar className="page-header__tabs" hideBorder={true}>
{main.children.map((child, index) => { {main.children.map((child, index) => {
return ( return (
!child.hideFromTabs && (
<Tab <Tab
label={child.text} label={child.text}
active={child.active} active={child.active}
@@ -67,6 +68,7 @@ const Navigation = ({ main }: { main: NavModelItem }) => {
icon={child.icon} icon={child.icon}
onChangeTab={() => goToUrl(index)} onChangeTab={() => goToUrl(index)}
/> />
)
); );
})} })}
</TabsBar> </TabsBar>

View File

@@ -1,6 +1,7 @@
import _ from 'lodash'; import _ from 'lodash';
import $ from 'jquery'; import $ from 'jquery';
import coreModule from 'app/core/core_module'; import coreModule from 'app/core/core_module';
import { promiseToDigest } from '../../utils/promiseToDigest';
const template = ` const template = `
<div class="dropdown cascade-open"> <div class="dropdown cascade-open">
@@ -138,9 +139,11 @@ export function queryPartEditorDirective(templateSrv: any) {
} }
$scope.showActionsMenu = () => { $scope.showActionsMenu = () => {
promiseToDigest($scope)(
$scope.handleEvent({ $event: { name: 'get-part-actions' } }).then((res: any) => { $scope.handleEvent({ $event: { name: 'get-part-actions' } }).then((res: any) => {
$scope.partActions = res; $scope.partActions = res;
}); })
);
}; };
$scope.triggerPartAction = (action: string) => { $scope.triggerPartAction = (action: string) => {

View File

@@ -172,7 +172,7 @@ export function sqlPartEditorDirective(templateSrv: any) {
} }
const paramValue = templateSrv.highlightVariablesAsHtml(part.params[index]); const paramValue = templateSrv.highlightVariablesAsHtml(part.params[index]);
const $paramLink = $('<a class="graphite-func-param-link pointer">' + paramValue + '</a>'); const $paramLink = $('<a class="query-part__link">' + paramValue + '</a>');
const $input = $(paramTemplate); const $input = $(paramTemplate);
$paramLink.appendTo($paramsContainer); $paramLink.appendTo($paramsContainer);

View File

@@ -223,7 +223,6 @@ describe('dataFrameToLogsModel', () => {
expect(logsModel.rows).toHaveLength(2); expect(logsModel.rows).toHaveLength(2);
expect(logsModel.rows).toMatchObject([ expect(logsModel.rows).toMatchObject([
{ {
timestamp: '2019-04-26T09:28:11.352440161Z',
entry: 't=2019-04-26T11:05:28+0200 lvl=info msg="Initializing DatasourceCacheService" logger=server', entry: 't=2019-04-26T11:05:28+0200 lvl=info msg="Initializing DatasourceCacheService" logger=server',
labels: { filename: '/var/log/grafana/grafana.log', job: 'grafana' }, labels: { filename: '/var/log/grafana/grafana.log', job: 'grafana' },
logLevel: 'info', logLevel: 'info',
@@ -231,7 +230,6 @@ describe('dataFrameToLogsModel', () => {
uid: 'foo', uid: 'foo',
}, },
{ {
timestamp: '2019-04-26T14:42:50.991981292Z',
entry: 't=2019-04-26T16:42:50+0200 lvl=eror msg="new token…t unhashed token=56d9fdc5c8b7400bd51b060eea8ca9d7', entry: 't=2019-04-26T16:42:50+0200 lvl=eror msg="new token…t unhashed token=56d9fdc5c8b7400bd51b060eea8ca9d7',
labels: { filename: '/var/log/grafana/grafana.log', job: 'grafana' }, labels: { filename: '/var/log/grafana/grafana.log', job: 'grafana' },
logLevel: 'error', logLevel: 'error',

View File

@@ -328,14 +328,13 @@ export function logSeriesToLogsModel(logSeries: DataFrame[]): LogsModel | undefi
timeFromNow: time.fromNow(), timeFromNow: time.fromNow(),
timeEpochMs: time.valueOf(), timeEpochMs: time.valueOf(),
timeLocal: time.format(logTimeFormat), timeLocal: time.format(logTimeFormat),
timeUtc: toUtc(ts).format(logTimeFormat), timeUtc: toUtc(time.valueOf()).format(logTimeFormat),
uniqueLabels, uniqueLabels,
hasAnsi, hasAnsi,
searchWords, searchWords,
entry: hasAnsi ? ansicolor.strip(message) : message, entry: hasAnsi ? ansicolor.strip(message) : message,
raw: message, raw: message,
labels: stringField.labels, labels: stringField.labels,
timestamp: ts,
uid: idField ? idField.values.get(j) : j.toString(), uid: idField ? idField.values.get(j) : j.toString(),
}); });
} }

View File

@@ -15,7 +15,15 @@ import {
} from './explore'; } from './explore';
import { ExploreUrlState, ExploreMode } from 'app/types/explore'; import { ExploreUrlState, ExploreMode } from 'app/types/explore';
import store from 'app/core/store'; import store from 'app/core/store';
import { DataQueryError, LogsDedupStrategy, LogsModel, LogLevel, dateTime, MutableDataFrame } from '@grafana/data'; import {
DataQueryError,
LogsDedupStrategy,
LogsModel,
LogLevel,
dateTime,
MutableDataFrame,
LogRowModel,
} from '@grafana/data';
import { RefreshPicker } from '@grafana/ui'; import { RefreshPicker } from '@grafana/ui';
const DEFAULT_EXPLORE_STATE: ExploreUrlState = { const DEFAULT_EXPLORE_STATE: ExploreUrlState = {
@@ -372,11 +380,10 @@ describe('refreshIntervalToSortOrder', () => {
}); });
describe('sortLogsResult', () => { describe('sortLogsResult', () => {
const firstRow = { const firstRow: LogRowModel = {
rowIndex: 0, rowIndex: 0,
entryFieldIndex: 0, entryFieldIndex: 0,
dataFrame: new MutableDataFrame(), dataFrame: new MutableDataFrame(),
timestamp: '2019-01-01T21:00:0.0000000Z',
entry: '', entry: '',
hasAnsi: false, hasAnsi: false,
labels: {}, labels: {},
@@ -389,17 +396,16 @@ describe('sortLogsResult', () => {
uid: '1', uid: '1',
}; };
const sameAsFirstRow = firstRow; const sameAsFirstRow = firstRow;
const secondRow = { const secondRow: LogRowModel = {
rowIndex: 1, rowIndex: 1,
entryFieldIndex: 0, entryFieldIndex: 0,
dataFrame: new MutableDataFrame(), dataFrame: new MutableDataFrame(),
timestamp: '2019-01-01T22:00:0.0000000Z',
entry: '', entry: '',
hasAnsi: false, hasAnsi: false,
labels: {}, labels: {},
logLevel: LogLevel.info, logLevel: LogLevel.info,
raw: '', raw: '',
timeEpochMs: 0, timeEpochMs: 10,
timeFromNow: '', timeFromNow: '',
timeLocal: '', timeLocal: '',
timeUtc: '', timeUtc: '',

View File

@@ -88,11 +88,12 @@ export async function getExploreUrl(args: GetExploreUrlArguments) {
const range = timeSrv.timeRangeForUrl(); const range = timeSrv.timeRangeForUrl();
let state: Partial<ExploreUrlState> = { range }; let state: Partial<ExploreUrlState> = { range };
if (exploreDatasource.interpolateVariablesInQueries) { if (exploreDatasource.interpolateVariablesInQueries) {
const scopedVars = panel.scopedVars || {};
state = { state = {
...state, ...state,
datasource: exploreDatasource.name, datasource: exploreDatasource.name,
context: 'explore', context: 'explore',
queries: exploreDatasource.interpolateVariablesInQueries(exploreTargets), queries: exploreDatasource.interpolateVariablesInQueries(exploreTargets, scopedVars),
}; };
} else { } else {
state = { state = {
@@ -473,11 +474,11 @@ export const getRefIds = (value: any): string[] => {
}; };
export const sortInAscendingOrder = (a: LogRowModel, b: LogRowModel) => { export const sortInAscendingOrder = (a: LogRowModel, b: LogRowModel) => {
if (a.timestamp < b.timestamp) { if (a.timeEpochMs < b.timeEpochMs) {
return -1; return -1;
} }
if (a.timestamp > b.timestamp) { if (a.timeEpochMs > b.timeEpochMs) {
return 1; return 1;
} }
@@ -485,11 +486,11 @@ export const sortInAscendingOrder = (a: LogRowModel, b: LogRowModel) => {
}; };
const sortInDescendingOrder = (a: LogRowModel, b: LogRowModel) => { const sortInDescendingOrder = (a: LogRowModel, b: LogRowModel) => {
if (a.timestamp > b.timestamp) { if (a.timeEpochMs > b.timeEpochMs) {
return -1; return -1;
} }
if (a.timestamp < b.timestamp) { if (a.timeEpochMs < b.timeEpochMs) {
return 1; return 1;
} }

View File

@@ -57,7 +57,6 @@ export const LicenseChrome: React.FC<Props> = ({ header, editionNotice, subheade
}} }}
> >
<img <img
className="logo-icon"
src="/public/img/grafana_icon.svg" src="/public/img/grafana_icon.svg"
alt="Grafana" alt="Grafana"
width="80px" width="80px"

View File

@@ -180,7 +180,7 @@ exports[`ServerStats Should render table with stats 1`] = `
className="fa fa-support" className="fa fa-support"
/> />
Support & Enterprise Support
</a> </a>
</li> </li>
<li> <li>
@@ -198,13 +198,22 @@ exports[`ServerStats Should render table with stats 1`] = `
</li> </li>
<li> <li>
<a <a
href="https://grafana.com"
rel="noopener" rel="noopener"
target="_blank" target="_blank"
> >
<i /> <i />
Grafana vv1.0 (commit: 1) undefined
</a>
</li>
<li>
<a
rel="noopener"
target="_blank"
>
<i />
vv1.0 (1)
</a> </a>
</li> </li>
</ul> </ul>

View File

@@ -10,6 +10,7 @@ import { CoreEvents } from 'app/types';
import { GrafanaRootScope } from 'app/routes/GrafanaCtrl'; import { GrafanaRootScope } from 'app/routes/GrafanaCtrl';
import { AppEvents } from '@grafana/data'; import { AppEvents } from '@grafana/data';
import { e2e } from '@grafana/e2e'; import { e2e } from '@grafana/e2e';
import locationUtil from 'app/core/utils/location_util';
export class SettingsCtrl { export class SettingsCtrl {
dashboard: DashboardModel; dashboard: DashboardModel;
@@ -183,7 +184,7 @@ export class SettingsCtrl {
this.buildSectionList(); this.buildSectionList();
const currentSection: any = _.find(this.sections, { id: this.viewId } as any); const currentSection: any = _.find(this.sections, { id: this.viewId } as any);
this.$location.url(currentSection.url); this.$location.url(locationUtil.stripBaseFromUrl(currentSection.url));
} }
deleteDashboard() { deleteDashboard() {

View File

@@ -24,6 +24,8 @@ const setup = (propOverrides?: object) => {
loadDataSource: jest.fn(), loadDataSource: jest.fn(),
setDataSourceName, setDataSourceName,
updateDataSource: jest.fn(), updateDataSource: jest.fn(),
initDataSourceSettings: jest.fn(),
testDataSource: jest.fn(),
setIsDefault, setIsDefault,
dataSourceLoaded, dataSourceLoaded,
query: {}, query: {},

View File

@@ -1,7 +1,6 @@
// Libraries // Libraries
import React, { PureComponent } from 'react'; import React, { PureComponent } from 'react';
import { hot } from 'react-hot-loader'; import { hot } from 'react-hot-loader';
import { connect } from 'react-redux';
import isString from 'lodash/isString'; import isString from 'lodash/isString';
import { e2e } from '@grafana/e2e'; import { e2e } from '@grafana/e2e';
// Components // Components
@@ -11,11 +10,15 @@ import BasicSettings from './BasicSettings';
import ButtonRow from './ButtonRow'; import ButtonRow from './ButtonRow';
// Services & Utils // Services & Utils
import appEvents from 'app/core/app_events'; import appEvents from 'app/core/app_events';
import { getBackendSrv } from 'app/core/services/backend_srv';
import { getDatasourceSrv } from 'app/features/plugins/datasource_srv';
// Actions & selectors // Actions & selectors
import { getDataSource, getDataSourceMeta } from '../state/selectors'; import { getDataSource, getDataSourceMeta } from '../state/selectors';
import { deleteDataSource, loadDataSource, updateDataSource } from '../state/actions'; import {
deleteDataSource,
loadDataSource,
updateDataSource,
initDataSourceSettings,
testDataSource,
} from '../state/actions';
import { getNavModel } from 'app/core/selectors/navModel'; import { getNavModel } from 'app/core/selectors/navModel';
import { getRouteParamsId } from 'app/core/selectors/location'; import { getRouteParamsId } from 'app/core/selectors/location';
// Types // Types
@@ -24,8 +27,8 @@ import { UrlQueryMap } from '@grafana/runtime';
import { DataSourcePluginMeta, DataSourceSettings, NavModel } from '@grafana/data'; import { DataSourcePluginMeta, DataSourceSettings, NavModel } from '@grafana/data';
import { getDataSourceLoadingNav } from '../state/navModel'; import { getDataSourceLoadingNav } from '../state/navModel';
import PluginStateinfo from 'app/features/plugins/PluginStateInfo'; import PluginStateinfo from 'app/features/plugins/PluginStateInfo';
import { importDataSourcePlugin } from 'app/features/plugins/plugin_loader';
import { dataSourceLoaded, setDataSourceName, setIsDefault } from '../state/reducers'; import { dataSourceLoaded, setDataSourceName, setIsDefault } from '../state/reducers';
import { connectWithCleanUp } from 'app/core/components/connectWithCleanUp';
export interface Props { export interface Props {
navModel: NavModel; navModel: NavModel;
@@ -38,55 +41,22 @@ export interface Props {
updateDataSource: typeof updateDataSource; updateDataSource: typeof updateDataSource;
setIsDefault: typeof setIsDefault; setIsDefault: typeof setIsDefault;
dataSourceLoaded: typeof dataSourceLoaded; dataSourceLoaded: typeof dataSourceLoaded;
initDataSourceSettings: typeof initDataSourceSettings;
testDataSource: typeof testDataSource;
plugin?: GenericDataSourcePlugin; plugin?: GenericDataSourcePlugin;
query: UrlQueryMap; query: UrlQueryMap;
page?: string; page?: string;
} testingStatus?: {
message?: string;
interface State { status?: string;
plugin?: GenericDataSourcePlugin;
isTesting?: boolean;
testingMessage?: string;
testingStatus?: string;
loadError?: any;
}
export class DataSourceSettingsPage extends PureComponent<Props, State> {
constructor(props: Props) {
super(props);
this.state = {
plugin: props.plugin,
}; };
loadError?: Error | string;
} }
async loadPlugin(pluginId?: string) { export class DataSourceSettingsPage extends PureComponent<Props> {
const { dataSourceMeta } = this.props; componentDidMount() {
let importedPlugin: GenericDataSourcePlugin; const { initDataSourceSettings, pageId } = this.props;
initDataSourceSettings(pageId);
try {
importedPlugin = await importDataSourcePlugin(dataSourceMeta);
} catch (e) {
console.log('Failed to import plugin module', e);
}
this.setState({ plugin: importedPlugin });
}
async componentDidMount() {
const { loadDataSource, pageId } = this.props;
if (isNaN(pageId)) {
this.setState({ loadError: 'Invalid ID' });
return;
}
try {
await loadDataSource(pageId);
if (!this.state.plugin) {
await this.loadPlugin();
}
} catch (err) {
this.setState({ loadError: err });
}
} }
onSubmit = async (evt: React.FormEvent<HTMLFormElement>) => { onSubmit = async (evt: React.FormEvent<HTMLFormElement>) => {
@@ -136,40 +106,9 @@ export class DataSourceSettingsPage extends PureComponent<Props, State> {
); );
} }
async testDataSource() { testDataSource() {
const dsApi = await getDatasourceSrv().get(this.props.dataSource.name); const { dataSource, testDataSource } = this.props;
testDataSource(dataSource.name);
if (!dsApi.testDatasource) {
return;
}
this.setState({ isTesting: true, testingMessage: 'Testing...', testingStatus: 'info' });
getBackendSrv().withNoBackendCache(async () => {
try {
const result = await dsApi.testDatasource();
this.setState({
isTesting: false,
testingStatus: result.status,
testingMessage: result.message,
});
} catch (err) {
let message = '';
if (err.statusText) {
message = 'HTTP Error ' + err.statusText;
} else {
message = err.message;
}
this.setState({
isTesting: false,
testingStatus: 'error',
testingMessage: message,
});
}
});
} }
get hasDataSource() { get hasDataSource() {
@@ -218,7 +157,7 @@ export class DataSourceSettingsPage extends PureComponent<Props, State> {
} }
renderConfigPageBody(page: string) { renderConfigPageBody(page: string) {
const { plugin } = this.state; const { plugin } = this.props;
if (!plugin || !plugin.configPages) { if (!plugin || !plugin.configPages) {
return null; // still loading return null; // still loading
} }
@@ -233,8 +172,7 @@ export class DataSourceSettingsPage extends PureComponent<Props, State> {
} }
renderSettings() { renderSettings() {
const { dataSourceMeta, setDataSourceName, setIsDefault, dataSource } = this.props; const { dataSourceMeta, setDataSourceName, setIsDefault, dataSource, testingStatus, plugin } = this.props;
const { testingMessage, testingStatus, plugin } = this.state;
return ( return (
<form onSubmit={this.onSubmit}> <form onSubmit={this.onSubmit}>
@@ -265,10 +203,10 @@ export class DataSourceSettingsPage extends PureComponent<Props, State> {
)} )}
<div className="gf-form-group"> <div className="gf-form-group">
{testingMessage && ( {testingStatus && testingStatus.message && (
<div className={`alert-${testingStatus} alert`} aria-label={e2e.pages.DataSource.selectors.alert}> <div className={`alert-${testingStatus.status} alert`} aria-label={e2e.pages.DataSource.selectors.alert}>
<div className="alert-icon"> <div className="alert-icon">
{testingStatus === 'error' ? ( {testingStatus.status === 'error' ? (
<i className="fa fa-exclamation-triangle" /> <i className="fa fa-exclamation-triangle" />
) : ( ) : (
<i className="fa fa-check" /> <i className="fa fa-check" />
@@ -276,7 +214,7 @@ export class DataSourceSettingsPage extends PureComponent<Props, State> {
</div> </div>
<div className="alert-body"> <div className="alert-body">
<div className="alert-title" aria-label={e2e.pages.DataSource.selectors.alertMessage}> <div className="alert-title" aria-label={e2e.pages.DataSource.selectors.alertMessage}>
{testingMessage} {testingStatus.message}
</div> </div>
</div> </div>
</div> </div>
@@ -294,8 +232,7 @@ export class DataSourceSettingsPage extends PureComponent<Props, State> {
} }
render() { render() {
const { navModel, page } = this.props; const { navModel, page, loadError } = this.props;
const { loadError } = this.state;
if (loadError) { if (loadError) {
return this.renderLoadError(loadError); return this.renderLoadError(loadError);
@@ -315,6 +252,7 @@ function mapStateToProps(state: StoreState) {
const pageId = getRouteParamsId(state.location); const pageId = getRouteParamsId(state.location);
const dataSource = getDataSource(state.dataSources, pageId); const dataSource = getDataSource(state.dataSources, pageId);
const page = state.location.query.page as string; const page = state.location.query.page as string;
const { plugin, loadError, testingStatus } = state.dataSourceSettings;
return { return {
navModel: getNavModel( navModel: getNavModel(
@@ -327,6 +265,9 @@ function mapStateToProps(state: StoreState) {
pageId: pageId, pageId: pageId,
query: state.location.query, query: state.location.query,
page, page,
plugin,
loadError,
testingStatus,
}; };
} }
@@ -337,6 +278,10 @@ const mapDispatchToProps = {
updateDataSource, updateDataSource,
setIsDefault, setIsDefault,
dataSourceLoaded, dataSourceLoaded,
initDataSourceSettings,
testDataSource,
}; };
export default hot(module)(connect(mapStateToProps, mapDispatchToProps)(DataSourceSettingsPage)); export default hot(module)(
connectWithCleanUp(mapStateToProps, mapDispatchToProps, state => state.dataSourceSettings)(DataSourceSettingsPage)
);

View File

@@ -1,5 +1,33 @@
import { findNewName, nameExits } from './actions'; import {
findNewName,
nameExits,
InitDataSourceSettingDependencies,
testDataSource,
TestDataSourceDependencies,
} from './actions';
import { getMockPlugin, getMockPlugins } from '../../plugins/__mocks__/pluginMocks'; import { getMockPlugin, getMockPlugins } from '../../plugins/__mocks__/pluginMocks';
import { thunkTester } from 'test/core/thunk/thunkTester';
import {
initDataSourceSettingsSucceeded,
initDataSourceSettingsFailed,
testDataSourceStarting,
testDataSourceSucceeded,
testDataSourceFailed,
} from './reducers';
import { initDataSourceSettings } from '../state/actions';
import { ThunkResult, ThunkDispatch } from 'app/types';
import { GenericDataSourcePlugin } from '../settings/PluginSettings';
const getBackendSrvMock = () =>
({
get: jest.fn().mockReturnValue({
testDatasource: jest.fn().mockReturnValue({
status: '',
message: '',
}),
}),
withNoBackendCache: jest.fn().mockImplementationOnce(cb => cb()),
} as any);
describe('Name exists', () => { describe('Name exists', () => {
const plugins = getMockPlugins(5); const plugins = getMockPlugins(5);
@@ -42,3 +70,131 @@ describe('Find new name', () => {
expect(findNewName(plugins, name)).toEqual('pretty cool plugin-'); expect(findNewName(plugins, name)).toEqual('pretty cool plugin-');
}); });
}); });
describe('initDataSourceSettings', () => {
describe('when pageId is not a number', () => {
it('then initDataSourceSettingsFailed should be dispatched', async () => {
const dispatchedActions = await thunkTester({})
.givenThunk(initDataSourceSettings)
.whenThunkIsDispatched('some page');
expect(dispatchedActions).toEqual([initDataSourceSettingsFailed(new Error('Invalid ID'))]);
});
});
describe('when pageId is a number', () => {
it('then initDataSourceSettingsSucceeded should be dispatched', async () => {
const thunkMock = (): ThunkResult<void> => (dispatch: ThunkDispatch, getState) => {};
const dataSource = { type: 'app' };
const dataSourceMeta = { id: 'some id' };
const dependencies: InitDataSourceSettingDependencies = {
loadDataSource: jest.fn(thunkMock),
getDataSource: jest.fn().mockReturnValue(dataSource),
getDataSourceMeta: jest.fn().mockReturnValue(dataSourceMeta),
importDataSourcePlugin: jest.fn().mockReturnValue({} as GenericDataSourcePlugin),
};
const state = {
dataSourceSettings: {},
dataSources: {},
};
const dispatchedActions = await thunkTester(state)
.givenThunk(initDataSourceSettings)
.whenThunkIsDispatched(256, dependencies);
expect(dispatchedActions).toEqual([initDataSourceSettingsSucceeded({} as GenericDataSourcePlugin)]);
expect(dependencies.loadDataSource).toHaveBeenCalledTimes(1);
expect(dependencies.loadDataSource).toHaveBeenCalledWith(256);
expect(dependencies.getDataSource).toHaveBeenCalledTimes(1);
expect(dependencies.getDataSource).toHaveBeenCalledWith({}, 256);
expect(dependencies.getDataSourceMeta).toHaveBeenCalledTimes(1);
expect(dependencies.getDataSourceMeta).toHaveBeenCalledWith({}, 'app');
expect(dependencies.importDataSourcePlugin).toHaveBeenCalledTimes(1);
expect(dependencies.importDataSourcePlugin).toHaveBeenCalledWith(dataSourceMeta);
});
});
describe('when plugin loading fails', () => {
it('then initDataSourceSettingsFailed should be dispatched', async () => {
const dependencies: InitDataSourceSettingDependencies = {
loadDataSource: jest.fn().mockImplementation(() => {
throw new Error('Error loading plugin');
}),
getDataSource: jest.fn(),
getDataSourceMeta: jest.fn(),
importDataSourcePlugin: jest.fn(),
};
const state = {
dataSourceSettings: {},
dataSources: {},
};
const dispatchedActions = await thunkTester(state)
.givenThunk(initDataSourceSettings)
.whenThunkIsDispatched(301, dependencies);
expect(dispatchedActions).toEqual([initDataSourceSettingsFailed(new Error('Error loading plugin'))]);
expect(dependencies.loadDataSource).toHaveBeenCalledTimes(1);
expect(dependencies.loadDataSource).toHaveBeenCalledWith(301);
});
});
});
describe('testDataSource', () => {
describe('when a datasource is tested', () => {
it('then testDataSourceStarting and testDataSourceSucceeded should be dispatched', async () => {
const dependencies: TestDataSourceDependencies = {
getDatasourceSrv: () =>
({
get: jest.fn().mockReturnValue({
testDatasource: jest.fn().mockReturnValue({
status: '',
message: '',
}),
}),
} as any),
getBackendSrv: getBackendSrvMock,
};
const state = {
testingStatus: {
status: '',
message: '',
},
};
const dispatchedActions = await thunkTester(state)
.givenThunk(testDataSource)
.whenThunkIsDispatched('Azure Monitor', dependencies);
expect(dispatchedActions).toEqual([testDataSourceStarting(), testDataSourceSucceeded(state.testingStatus)]);
});
it('then testDataSourceFailed should be dispatched', async () => {
const dependencies: TestDataSourceDependencies = {
getDatasourceSrv: () =>
({
get: jest.fn().mockReturnValue({
testDatasource: jest.fn().mockImplementation(() => {
throw new Error('Error testing datasource');
}),
}),
} as any),
getBackendSrv: getBackendSrvMock,
};
const result = {
message: 'Error testing datasource',
};
const state = {
testingStatus: {
message: '',
status: '',
},
};
const dispatchedActions = await thunkTester(state)
.givenThunk(testDataSource)
.whenThunkIsDispatched('Azure Monitor', dependencies);
expect(dispatchedActions).toEqual([testDataSourceStarting(), testDataSourceFailed(result)]);
});
});
});

View File

@@ -1,10 +1,10 @@
import config from '../../../core/config'; import config from '../../../core/config';
import { getBackendSrv } from '@grafana/runtime'; import { getBackendSrv } from 'app/core/services/backend_srv';
import { getDatasourceSrv } from 'app/features/plugins/datasource_srv'; import { getDatasourceSrv } from 'app/features/plugins/datasource_srv';
import { updateLocation, updateNavIndex } from 'app/core/actions'; import { updateLocation, updateNavIndex } from 'app/core/actions';
import { buildNavModel } from './navModel'; import { buildNavModel } from './navModel';
import { DataSourcePluginMeta, DataSourceSettings } from '@grafana/data'; import { DataSourcePluginMeta, DataSourceSettings } from '@grafana/data';
import { DataSourcePluginCategory, ThunkResult } from 'app/types'; import { DataSourcePluginCategory, ThunkResult, ThunkDispatch } from 'app/types';
import { getPluginSettings } from 'app/features/plugins/PluginSettingsCache'; import { getPluginSettings } from 'app/features/plugins/PluginSettingsCache';
import { importDataSourcePlugin } from 'app/features/plugins/plugin_loader'; import { importDataSourcePlugin } from 'app/features/plugins/plugin_loader';
import { import {
@@ -13,14 +13,102 @@ import {
dataSourcePluginsLoad, dataSourcePluginsLoad,
dataSourcePluginsLoaded, dataSourcePluginsLoaded,
dataSourcesLoaded, dataSourcesLoaded,
initDataSourceSettingsFailed,
initDataSourceSettingsSucceeded,
testDataSourceStarting,
testDataSourceSucceeded,
testDataSourceFailed,
} from './reducers'; } from './reducers';
import { buildCategories } from './buildCategories'; import { buildCategories } from './buildCategories';
import { getDataSource, getDataSourceMeta } from './selectors';
import { getDataSourceSrv } from '@grafana/runtime';
export interface DataSourceTypesLoadedPayload { export interface DataSourceTypesLoadedPayload {
plugins: DataSourcePluginMeta[]; plugins: DataSourcePluginMeta[];
categories: DataSourcePluginCategory[]; categories: DataSourcePluginCategory[];
} }
export interface InitDataSourceSettingDependencies {
loadDataSource: typeof loadDataSource;
getDataSource: typeof getDataSource;
getDataSourceMeta: typeof getDataSourceMeta;
importDataSourcePlugin: typeof importDataSourcePlugin;
}
export interface TestDataSourceDependencies {
getDatasourceSrv: typeof getDataSourceSrv;
getBackendSrv: typeof getBackendSrv;
}
export const initDataSourceSettings = (
pageId: number,
dependencies: InitDataSourceSettingDependencies = {
loadDataSource,
getDataSource,
getDataSourceMeta,
importDataSourcePlugin,
}
): ThunkResult<void> => {
return async (dispatch: ThunkDispatch, getState) => {
if (isNaN(pageId)) {
dispatch(initDataSourceSettingsFailed(new Error('Invalid ID')));
return;
}
try {
await dispatch(dependencies.loadDataSource(pageId));
if (getState().dataSourceSettings.plugin) {
return;
}
const dataSource = dependencies.getDataSource(getState().dataSources, pageId);
const dataSourceMeta = dependencies.getDataSourceMeta(getState().dataSources, dataSource.type);
const importedPlugin = await dependencies.importDataSourcePlugin(dataSourceMeta);
dispatch(initDataSourceSettingsSucceeded(importedPlugin));
} catch (err) {
console.log('Failed to import plugin module', err);
dispatch(initDataSourceSettingsFailed(err));
}
};
};
export const testDataSource = (
dataSourceName: string,
dependencies: TestDataSourceDependencies = {
getDatasourceSrv,
getBackendSrv,
}
): ThunkResult<void> => {
return async (dispatch: ThunkDispatch, getState) => {
const dsApi = await dependencies.getDatasourceSrv().get(dataSourceName);
if (!dsApi.testDatasource) {
return;
}
dispatch(testDataSourceStarting());
dependencies.getBackendSrv().withNoBackendCache(async () => {
try {
const result = await dsApi.testDatasource();
dispatch(testDataSourceSucceeded(result));
} catch (err) {
let message = '';
if (err.statusText) {
message = 'HTTP Error ' + err.statusText;
} else {
message = err.message;
}
dispatch(testDataSourceFailed({ message }));
}
});
};
};
export function loadDataSources(): ThunkResult<void> { export function loadDataSources(): ThunkResult<void> {
return async dispatch => { return async dispatch => {
const response = await getBackendSrv().get('/api/datasources'); const response = await getBackendSrv().get('/api/datasources');
@@ -123,7 +211,7 @@ export function findNewName(dataSources: ItemWithName[], name: string) {
function updateFrontendSettings() { function updateFrontendSettings() {
return getBackendSrv() return getBackendSrv()
.get('/api/frontend/settings') .get('/api/frontend/settings')
.then(settings => { .then((settings: any) => {
config.datasources = settings.datasources; config.datasources = settings.datasources;
config.defaultDatasource = settings.defaultDatasource; config.defaultDatasource = settings.defaultDatasource;
getDatasourceSrv().init(); getDatasourceSrv().init();

View File

@@ -12,11 +12,16 @@ import {
setDataSourcesSearchQuery,
setDataSourceTypeSearchQuery,
setIsDefault,
dataSourceSettingsReducer,
initialDataSourceSettingsState,
initDataSourceSettingsSucceeded,
initDataSourceSettingsFailed,
} from './reducers';
import { getMockDataSource, getMockDataSources } from '../__mocks__/dataSourcesMocks';
import { LayoutModes } from 'app/core/components/LayoutSelector/LayoutSelector';
import { DataSourcesState } from 'app/types';
import { DataSourcesState, DataSourceSettingsState } from 'app/types';
import { PluginMeta, PluginMetaInfo, PluginType } from '@grafana/data';
import { GenericDataSourcePlugin } from '../settings/PluginSettings';
const mockPlugin = () =>
({
@@ -136,3 +141,34 @@ describe('dataSourcesReducer', () => {
});
});
});
describe('dataSourceSettingsReducer', () => {
describe('when initDataSourceSettingsSucceeded is dispatched', () => {
it('then state should be correct', () => {
reducerTester<DataSourceSettingsState>()
.givenReducer(dataSourceSettingsReducer, { ...initialDataSourceSettingsState })
.whenActionIsDispatched(initDataSourceSettingsSucceeded({} as GenericDataSourcePlugin))
.thenStateShouldEqual({
...initialDataSourceSettingsState,
plugin: {} as GenericDataSourcePlugin,
loadError: null,
});
});
});
describe('when initDataSourceSettingsFailed is dispatched', () => {
it('then state should be correct', () => {
reducerTester<DataSourceSettingsState>()
.givenReducer(dataSourceSettingsReducer, {
...initialDataSourceSettingsState,
plugin: {} as GenericDataSourcePlugin,
})
.whenActionIsDispatched(initDataSourceSettingsFailed(new Error('Some error')))
.thenStatePredicateShouldEqual(resultingState => {
expect(resultingState.plugin).toEqual(null);
expect(resultingState.loadError).toEqual('Some error');
return true;
});
});
});
});

View File

@@ -1,9 +1,10 @@
import { AnyAction, createAction } from '@reduxjs/toolkit';
import { DataSourcePluginMeta, DataSourceSettings } from '@grafana/data';
import { DataSourcesState } from 'app/types';
import { DataSourcesState, DataSourceSettingsState } from 'app/types';
import { LayoutMode, LayoutModes } from 'app/core/components/LayoutSelector/LayoutSelector';
import { DataSourceTypesLoadedPayload } from './actions';
import { GenericDataSourcePlugin } from '../settings/PluginSettings';
export const initialState: DataSourcesState = {
dataSources: [],
@@ -94,6 +95,76 @@ export const dataSourcesReducer = (state: DataSourcesState = initialState, actio
return state;
};
export const initialDataSourceSettingsState: DataSourceSettingsState = {
testingStatus: {
status: null,
message: null,
},
loadError: null,
plugin: null,
};
export const initDataSourceSettingsSucceeded = createAction<GenericDataSourcePlugin>(
'dataSourceSettings/initDataSourceSettingsSucceeded'
);
export const initDataSourceSettingsFailed = createAction<Error>('dataSourceSettings/initDataSourceSettingsFailed');
export const testDataSourceStarting = createAction<undefined>('dataSourceSettings/testDataSourceStarting');
export const testDataSourceSucceeded = createAction<{
status: string;
message: string;
}>('dataSourceSettings/testDataSourceSucceeded');
export const testDataSourceFailed = createAction<{ message: string }>('dataSourceSettings/testDataSourceFailed');
export const dataSourceSettingsReducer = (
state: DataSourceSettingsState = initialDataSourceSettingsState,
action: AnyAction
): DataSourceSettingsState => {
if (initDataSourceSettingsSucceeded.match(action)) {
return { ...state, plugin: action.payload, loadError: null };
}
if (initDataSourceSettingsFailed.match(action)) {
return { ...state, plugin: null, loadError: action.payload.message };
}
if (testDataSourceStarting.match(action)) {
return {
...state,
testingStatus: {
message: 'Testing...',
status: 'info',
},
};
}
if (testDataSourceSucceeded.match(action)) {
return {
...state,
testingStatus: {
status: action.payload.status,
message: action.payload.message,
},
};
}
if (testDataSourceFailed.match(action)) {
return {
...state,
testingStatus: {
status: 'error',
message: action.payload.message,
},
};
}
return state;
};
export default {
dataSources: dataSourcesReducer,
dataSourceSettings: dataSourceSettingsReducer,
};
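A quick way to see the testingStatus transitions above is to run the reducer by hand; a minimal sketch (the messages are illustrative):
// Sketch: walking dataSourceSettingsReducer through a test cycle.
let state = dataSourceSettingsReducer(initialDataSourceSettingsState, testDataSourceStarting());
// state.testingStatus -> { status: 'info', message: 'Testing...' }
state = dataSourceSettingsReducer(state, testDataSourceSucceeded({ status: 'success', message: 'Data source is working' }));
// state.testingStatus -> { status: 'success', message: 'Data source is working' }
state = dataSourceSettingsReducer(state, testDataSourceFailed({ message: 'Connection refused' }));
// state.testingStatus -> { status: 'error', message: 'Connection refused' }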

View File

@@ -70,7 +70,6 @@ const makeLog = (overides: Partial<LogRowModel>): LogRowModel => {
hasAnsi: false,
labels: {},
raw: entry,
timestamp: '',
timeFromNow: '',
timeEpochMs: 1,
timeLocal: '',

View File

@@ -187,7 +187,6 @@ describe('ResultProcessor', () => {
timeFromNow: 'fromNow() jest mocked',
timeLocal: 'format() jest mocked',
timeUtc: 'format() jest mocked',
timestamp: 300,
uid: '2',
uniqueLabels: {},
},
@@ -205,7 +204,6 @@ describe('ResultProcessor', () => {
timeFromNow: 'fromNow() jest mocked',
timeLocal: 'format() jest mocked',
timeUtc: 'format() jest mocked',
timestamp: 200,
uid: '1',
uniqueLabels: {},
},
@@ -223,7 +221,6 @@ describe('ResultProcessor', () => {
timeFromNow: 'fromNow() jest mocked',
timeLocal: 'format() jest mocked',
timeUtc: 'format() jest mocked',
timestamp: 100,
uid: '0',
uniqueLabels: {},
},

View File

@@ -1,4 +1,4 @@
import angular from 'angular';
import angular, { ILocationService } from 'angular';
import _ from 'lodash';
import config from 'app/core/config';
@@ -16,7 +16,8 @@ function pluginDirectiveLoader(
$rootScope: GrafanaRootScope,
$http: any,
$templateCache: any,
$timeout: any
$timeout: any,
$location: ILocationService
) {
function getTemplate(component: { template: any; templateUrl: any }) {
if (component.template) {
@@ -126,10 +127,13 @@
}
// Annotations
case 'annotations-query-ctrl': {
const baseUrl = scope.ctrl.currentDatasource.meta.baseUrl;
const pluginId = scope.ctrl.currentDatasource.meta.id;
return importDataSourcePlugin(scope.ctrl.currentDatasource.meta).then(dsPlugin => {
return {
baseUrl: scope.ctrl.currentDatasource.meta.baseUrl,
name: 'annotations-query-ctrl-' + scope.ctrl.currentDatasource.meta.id,
baseUrl,
name: 'annotations-query-ctrl-' + pluginId,
bindings: { annotation: '=', datasource: '=' },
attrs: {
annotation: 'ctrl.currentAnnotation',
@@ -142,11 +146,19 @@
// Datasource ConfigCtrl
case 'datasource-config-ctrl': {
const dsMeta = scope.ctrl.datasourceMeta;
const angularUrl = $location.url();
return importDataSourcePlugin(dsMeta).then(dsPlugin => {
scope.$watch(
'ctrl.current',
() => {
// This watcher can trigger when we navigate away due to late digests.
// The check below stops onModelChanged from being called while navigating away, because the
// callback dispatches a redux action before the angular $routeChangeSuccess event fires,
// which would make the bridgeSrv think the location changed from redux before it detects
// that it was actually changed from angular.
if (angularUrl === $location.url()) {
scope.onModelChanged(scope.ctrl.current);
}
},
true
);
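The guard above captures the Angular URL when the config ctrl loads and only forwards model changes while the user is still on that page. A standalone sketch of the same pattern, assuming an Angular-style $location service (all names here are illustrative):
// Sketch: ignore watcher callbacks fired by late digests after navigation.
function watchWhileOnPage($location: { url(): string }, onModelChanged: (model: unknown) => void) {
  const urlAtLoad = $location.url(); // remember where the watcher was registered
  return (model: unknown) => {
    // Only forward the change if the user is still on the page the watcher was created for.
    if ($location.url() === urlAtLoad) {
      onModelChanged(model);
    }
  };
}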

View File

@@ -6,6 +6,7 @@ import { PluginMeta } from '@grafana/data';
import { NavModelSrv } from 'app/core/core';
import { GrafanaRootScope } from 'app/routes/GrafanaCtrl';
import { AppEvents } from '@grafana/data';
import { promiseToDigest } from '../../core/utils/promiseToDigest';
export class AppPageCtrl {
page: any;
@@ -17,6 +18,7 @@ export class AppPageCtrl {
constructor(private $routeParams: any, private $rootScope: GrafanaRootScope, private navModelSrv: NavModelSrv) {
this.pluginId = $routeParams.pluginId;
promiseToDigest($rootScope)(
Promise.resolve(getPluginSettings(this.pluginId))
.then(settings => {
this.initPage(settings);
@@ -24,7 +26,8 @@ export class AppPageCtrl {
.catch(err => {
this.$rootScope.appEvent(AppEvents.alertError, ['Unknown Plugin']);
this.navModel = this.navModelSrv.getNotFoundNav();
});
})
);
}
initPage(app: PluginMeta) {

View File

@@ -31,6 +31,14 @@ describe('templateSrv', () => {
expect(target).toBe('Server1 nested');
});
it('built in vars should support objects', () => {
_templateSrv.setGlobalVariable('__dashboard', {
value: { name: 'hello' },
});
const target = _templateSrv.replace('${__dashboard.name}');
expect(target).toBe('hello');
});
it('scoped vars should support objects with property names with dot', () => {
const target = _templateSrv.replace('${series.name} ${series.nested["field.with.dot"]}', {
series: { value: { name: 'Server1', nested: { 'field.with.dot': 'nested' } } },

View File

@@ -195,6 +195,15 @@ export class TemplateSrv {
this.grafanaVariables[name] = value;
}
setGlobalVariable(name: string, variable: any) {
this.index = {
...this.index,
[name]: {
current: variable,
},
};
}
getVariableName(expression: string) {
this.regex.lastIndex = 0;
const match = this.regex.exec(expression);
@@ -295,6 +304,15 @@ export class TemplateSrv {
}
}
if (fieldPath) {
const fieldValue = this.getVariableValue(variableName, fieldPath, {
[variableName]: { value: value, text: '' },
});
if (fieldValue !== null && fieldValue !== undefined) {
return this.formatValue(fieldValue, fmt, variable);
}
}
const res = this.formatValue(value, fmt, variable);
return res;
});
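Combined with the new field-path handling in replace(), setGlobalVariable makes object-valued built-ins addressable by property. A small usage sketch; the instance name and the second property are illustrative:
// Sketch: registering an object-valued built-in variable and dereferencing its fields.
templateSrv.setGlobalVariable('__dashboard', { value: { name: 'Production overview', uid: 'abc123' } });
templateSrv.replace('${__dashboard.name}'); // -> 'Production overview'
templateSrv.replace('${__dashboard.uid}');  // -> 'abc123'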

View File

@@ -11,9 +11,10 @@ import { TimeSrv } from 'app/features/dashboard/services/TimeSrv';
import { DashboardModel } from 'app/features/dashboard/state/DashboardModel';
// Types
import { TimeRange } from '@grafana/data';
import { TimeRange, AppEvents } from '@grafana/data';
import { CoreEvents } from 'app/types';
import { UrlQueryMap } from '@grafana/runtime';
import { appEvents } from 'app/core/core';
export class VariableSrv {
dashboard: DashboardModel;
@@ -71,8 +72,13 @@ export class VariableSrv {
});
});
return this.$q.all(promises).then(() => {
return this.$q
.all(promises)
.then(() => {
this.dashboard.startRefresh();
})
.catch(e => {
appEvents.emit(AppEvents.alertError, ['Template variable service failed', e.message]);
});
}
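The same emit pattern can be reused anywhere a background promise should surface a toast instead of failing silently; a small sketch, where the refresh callback is illustrative:
// Sketch: surface a failed background task as an alert toast rather than an unhandled rejection.
import { AppEvents } from '@grafana/data';
import { appEvents } from 'app/core/core';

function refreshWithAlert(refresh: () => Promise<void>) {
  return refresh().catch(e => {
    appEvents.emit(AppEvents.alertError, ['Template variable service failed', e.message]);
  });
}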

View File

@@ -94,7 +94,7 @@ export function QueryFieldsEditor({
placeholder="Select region" placeholder="Select region"
options={regions} options={regions}
allowCustomValue allowCustomValue
onChange={({ value: region }) => onChange({ ...query, region })} onChange={({ value: region }) => onQueryChange({ ...query, region })}
/> />
</QueryInlineField> </QueryInlineField>
@@ -106,7 +106,7 @@ export function QueryFieldsEditor({
placeholder="Select namespace" placeholder="Select namespace"
allowCustomValue allowCustomValue
options={namespaces} options={namespaces}
onChange={({ value: namespace }) => onChange({ ...query, namespace })} onChange={({ value: namespace }) => onQueryChange({ ...query, namespace })}
/> />
</QueryInlineField> </QueryInlineField>
@@ -116,7 +116,7 @@ export function QueryFieldsEditor({
placeholder="Select metric name" placeholder="Select metric name"
allowCustomValue allowCustomValue
loadOptions={loadMetricNames} loadOptions={loadMetricNames}
onChange={({ value: metricName }) => onChange({ ...query, metricName })} onChange={({ value: metricName }) => onQueryChange({ ...query, metricName })}
/> />
</QueryInlineField> </QueryInlineField>

View File

@@ -6,6 +6,7 @@ import {
DataQueryRequest,
DataQueryResponse,
DataFrame,
ScopedVars,
} from '@grafana/data';
import { ElasticResponse } from './elastic_response';
import { IndexPattern } from './index_pattern';
@@ -264,14 +265,14 @@ export class ElasticDatasource extends DataSourceApi<ElasticsearchQuery, Elastic
});
}
interpolateVariablesInQueries(queries: ElasticsearchQuery[]): ElasticsearchQuery[] {
interpolateVariablesInQueries(queries: ElasticsearchQuery[], scopedVars: ScopedVars): ElasticsearchQuery[] {
let expandedQueries = queries;
if (queries && queries.length > 0) {
expandedQueries = queries.map(query => {
const expandedQuery = {
...query,
datasource: this.name,
query: this.templateSrv.replace(query.query, {}, 'lucene'),
query: this.templateSrv.replace(query.query, scopedVars, 'lucene'),
};
return expandedQuery;
});
@@ -344,7 +345,7 @@ export class ElasticDatasource extends DataSourceApi<ElasticsearchQuery, Elastic
target.metrics = [queryDef.defaultMetricAgg()];
// Setting this for metrics queries that are typed as logs
target.isLogsQuery = true;
queryObj = this.queryBuilder.getLogsQuery(target, queryString);
queryObj = this.queryBuilder.getLogsQuery(target, adhocFilters, queryString);
} else {
if (target.alias) {
target.alias = this.templateSrv.replace(target.alias, options.scopedVars, 'lucene');

View File

@@ -379,7 +379,7 @@ export class ElasticQueryBuilder {
return query;
}
getLogsQuery(target: any, querystring: string) {
getLogsQuery(target: any, adhocFilters?: any, querystring?: string) {
let query: any = {
size: 0,
query: {
@@ -389,6 +389,8 @@
},
};
this.addAdhocFilters(query, adhocFilters);
if (target.query) {
query.query.bool.filter.push({
query_string: {
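A hedged usage sketch of the new getLogsQuery signature; the constructor options are assumptions about how the builder is normally instantiated, and the '=' filter is expected to be folded into the query by addAdhocFilters, as the tests in the next file illustrate:
// Sketch: building a logs query with one ad-hoc filter (constructor options are assumed, not verified).
const builder = new ElasticQueryBuilder({ timeField: '@timestamp', esVersion: 5 });
const adhocFilters = [{ key: 'level', operator: '=', value: 'error' }];
const logsQuery = builder.getLogsQuery({ query: 'status:500' }, adhocFilters, 'status:500');
// logsQuery.query.bool combines the time-range filter, the query_string filter and the ad-hoc clause.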

View File

@@ -476,7 +476,6 @@ describe('ElasticQueryBuilder', () => {
it('should set correct explicit sorting', () => {
const order = testGetTermsQuery({ order: 'desc' });
console.log({ order });
checkSort(order, 'desc');
expect(order._count).toBeUndefined();
});
@@ -496,11 +495,68 @@
});
});
it('getTermsQuery should request documents and date histogram', () => {
const query = builder.getLogsQuery({}, '');
console.log({ query });
expect(query).toHaveProperty('query.bool.filter');
expect(query.aggs['2']).toHaveProperty('date_histogram');
describe('getLogsQuery', () => {
it('should return query with defaults', () => {
const query = builder.getLogsQuery({}, null, '*');
expect(query.size).toEqual(500);
const expectedQuery = {
bool: {
filter: [{ range: { '@timestamp': { gte: '$timeFrom', lte: '$timeTo', format: 'epoch_millis' } } }],
},
};
expect(query.query).toEqual(expectedQuery);
expect(query.sort).toEqual({ '@timestamp': { order: 'desc', unmapped_type: 'boolean' } });
const expectedAggs = {
2: {
aggs: {},
date_histogram: {
extended_bounds: { max: '$timeTo', min: '$timeFrom' },
field: '@timestamp',
format: 'epoch_millis',
interval: '$__interval',
min_doc_count: 0,
},
},
};
expect(query.aggs).toMatchObject(expectedAggs);
});
it('with querystring', () => {
const query = builder.getLogsQuery({ query: 'foo' }, null, 'foo');
const expectedQuery = {
bool: {
filter: [
{ range: { '@timestamp': { gte: '$timeFrom', lte: '$timeTo', format: 'epoch_millis' } } },
{ query_string: { analyze_wildcard: true, query: 'foo' } },
],
},
};
expect(query.query).toEqual(expectedQuery);
});
it('with adhoc filters', () => {
const adhocFilters = [
{ key: 'key1', operator: '=', value: 'value1' },
{ key: 'key2', operator: '!=', value: 'value2' },
{ key: 'key3', operator: '<', value: 'value3' },
{ key: 'key4', operator: '>', value: 'value4' },
{ key: 'key5', operator: '=~', value: 'value5' },
{ key: 'key6', operator: '!~', value: 'value6' },
];
const query = builder.getLogsQuery({}, adhocFilters, '*');
expect(query.query.bool.must[0].match_phrase['key1'].query).toBe('value1');
expect(query.query.bool.must_not[0].match_phrase['key2'].query).toBe('value2');
expect(query.query.bool.filter[1].range['key3'].lt).toBe('value3');
expect(query.query.bool.filter[2].range['key4'].gt).toBe('value4');
expect(query.query.bool.filter[3].regexp['key5']).toBe('value5');
expect(query.query.bool.filter[4].bool.must_not.regexp['key6']).toBe('value6');
});
});
});
});

View File

@@ -298,7 +298,12 @@ export class ConfigEditor extends PureComponent<Props, State> {
onLoadWorkspaces={this.getWorkspaces}
/>
<InsightsConfig options={options} onUpdateOption={this.updateOption} onResetOptionKey={this.resetKey} />
<InsightsConfig
options={options}
onUpdateOption={this.updateOption}
onUpdateSecureOption={this.updateSecureOption}
onResetOptionKey={this.resetKey}
/>
</>
);
}

View File

@@ -35,6 +35,7 @@ const setup = (propOverrides?: object) => {
readOnly: false,
},
onUpdateOption: jest.fn(),
onUpdateSecureOption: jest.fn(),
onResetOptionKey: jest.fn(),
};

View File

@@ -4,16 +4,17 @@ import { AzureDataSourceSettings } from '../types';
export interface Props {
options: AzureDataSourceSettings;
onUpdateOption: (key: string, val: any, secure: boolean) => void;
onUpdateOption: (key: string, val: any) => void;
onUpdateSecureOption: (key: string, val: any) => void;
onResetOptionKey: (key: string) => void;
}
export class InsightsConfig extends PureComponent<Props> {
onAppInsightsAppIdChange = (event: ChangeEvent<HTMLInputElement>) => {
this.props.onUpdateOption('appInsightsAppId', event.target.value, false);
this.props.onUpdateOption('appInsightsAppId', event.target.value);
};
onAppInsightsApiKeyChange = (event: ChangeEvent<HTMLInputElement>) => {
this.props.onUpdateOption('appInsightsApiKey', event.target.value, true);
this.props.onUpdateSecureOption('appInsightsApiKey', event.target.value);
};
onAppInsightsResetApiKey = () => {
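A hedged sketch of what a parent now has to provide for the split callbacks; the wrapper component, setOptions helper, and the jsonData/secureJsonData layout are assumptions, not the actual ConfigEditor implementation:
// Sketch: wiring plain vs. secure option updates from a parent (helper names are illustrative).
import React from 'react';
import { InsightsConfig } from './InsightsConfig';
import { AzureDataSourceSettings } from '../types';

export const InsightsSettings = (props: { options: AzureDataSourceSettings; setOptions: (o: AzureDataSourceSettings) => void }) => {
  const { options, setOptions } = props;
  return (
    <InsightsConfig
      options={options}
      onUpdateOption={(key, val) => setOptions({ ...options, jsonData: { ...options.jsonData, [key]: val } })}
      onUpdateSecureOption={(key, val) => setOptions({ ...options, secureJsonData: { ...options.secureJsonData, [key]: val } })}
      onResetOptionKey={key => setOptions({ ...options, secureJsonFields: { ...options.secureJsonFields, [key]: false } })}
    />
  );
};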

View File

@@ -80,6 +80,7 @@ exports[`Render should render component 1`] = `
<InsightsConfig
onResetOptionKey={[Function]}
onUpdateOption={[Function]}
onUpdateSecureOption={[Function]}
options={
Object {
"access": "proxy",

View File

@@ -148,14 +148,14 @@ export class GraphiteDatasource extends DataSourceApi<GraphiteQuery, GraphiteOpt
return tags;
}
interpolateVariablesInQueries(queries: GraphiteQuery[]): GraphiteQuery[] {
interpolateVariablesInQueries(queries: GraphiteQuery[], scopedVars: ScopedVars): GraphiteQuery[] {
let expandedQueries = queries;
if (queries && queries.length > 0) {
expandedQueries = queries.map(query => {
const expandedQuery = {
...query,
datasource: this.name,
target: this.templateSrv.replace(query.target),
target: this.templateSrv.replace(query.target, scopedVars),
};
return expandedQuery;
});

View File

@@ -184,24 +184,27 @@ export function graphiteFuncEditor($compile: any, templateSrv: TemplateSrv) {
}
let paramValue = templateSrv.highlightVariablesAsHtml(func.params[index]);
const hasValue = paramValue !== null && paramValue !== undefined;
const hasValue = paramValue !== null && paramValue !== undefined && paramValue !== '';
const last = index >= func.params.length - 1 && param.optional && !hasValue;
let linkClass = 'query-part__link';
if (last) {
linkClass += ' query-part__last';
}
if (last && param.multiple) {
paramValue = '+';
} else if (!hasValue) {
// for params with no value default to param name
paramValue = param.name;
linkClass += ' query-part__link--no-value';
}
if (index > 0) {
$('<span class="comma' + (last ? ' query-part__last' : '') + '">, </span>').appendTo(elem);
}
const $paramLink = $(
'<a ng-click="" class="graphite-func-param-link' +
(last ? ' query-part__last' : '') +
'">' +
(hasValue ? paramValue : '&nbsp;') +
'</a>'
);
const $paramLink = $(`<a ng-click="" class="${linkClass}">${paramValue}</a>`);
const $input = $(paramTemplate);
$input.attr('placeholder', param.name);
@@ -232,7 +235,7 @@ export function graphiteFuncEditor($compile: any, templateSrv: TemplateSrv) {
$scope.func.added = false;
setTimeout(() => {
elem
.find('.graphite-func-param-link')
.find('.query-part__link')
.first()
.click();
}, 10);

View File

@@ -1,4 +1,10 @@
import { pairsAreValid } from './InfluxLogsQueryField';
import React from 'react';
import { mount } from 'enzyme';
import { InfluxLogsQueryField, pairsAreValid } from './InfluxLogsQueryField';
import { InfluxDatasourceMock } from '../datasource.mock';
import InfluxDatasource from '../datasource';
import { InfluxQuery } from '../types';
import { ButtonCascader } from '@grafana/ui';
describe('pairsAreValid()', () => {
describe('when all pairs are fully defined', () => {
@@ -51,3 +57,43 @@ describe('pairsAreValid()', () => {
});
});
});
describe('InfluxLogsQueryField', () => {
it('should load and show correct measurements and fields in cascader', async () => {
const wrapper = getInfluxLogsQueryField();
// Looks strange, but we do async work in componentDidMount; awaiting a resolved promise pushes the
// assertions to the end of the event loop, effectively waiting for componentDidMount to finish.
await Promise.resolve();
wrapper.update();
const cascader = wrapper.find(ButtonCascader);
expect(cascader.prop('options')).toEqual([
{ label: 'logs', value: 'logs', children: [{ label: 'description', value: 'description', children: [] }] },
]);
});
});
function getInfluxLogsQueryField(props?: any) {
const datasource: InfluxDatasource = new InfluxDatasourceMock(
props?.measurements || {
logs: [{ name: 'description', type: 'string' }],
}
) as any;
const defaultProps = {
datasource,
history: [] as any[],
onRunQuery: () => {},
onChange: (query: InfluxQuery) => {},
query: {
refId: '',
} as InfluxQuery,
};
return mount(
<InfluxLogsQueryField
{...{
...defaultProps,
...props,
}}
/>
);
}

View File

@@ -75,7 +75,7 @@ export class InfluxLogsQueryField extends React.PureComponent<Props, State> {
measurements.push({
label: measurementObj.text,
value: measurementObj.text,
items: fields,
children: fields,
});
}
this.setState({ measurements });

View File

@@ -0,0 +1,54 @@
type FieldsDefinition = {
name: string;
// String type, usually something like 'string' or 'float'.
type: string;
};
type Measurements = { [measurement: string]: FieldsDefinition[] };
type FieldReturnValue = { text: string };
/**
* Datasource mock for influx. At the moment this only works for queries that should return measurements or their
* fields and no other functionality is implemented.
*/
export class InfluxDatasourceMock {
constructor(private measurements: Measurements) {}
metricFindQuery(query: string) {
if (isMeasurementsQuery(query)) {
return this.getMeasurements();
} else {
return this.getMeasurementFields(query);
}
}
private getMeasurements(): FieldReturnValue[] {
return Object.keys(this.measurements).map(key => ({ text: key }));
}
private getMeasurementFields(query: string): FieldReturnValue[] {
const match = query.match(/SHOW FIELD KEYS FROM \"(.+)\"/);
if (!match) {
throw new Error(`Failed to match query="${query}"`);
}
const measurementName = match[1];
if (!measurementName) {
throw new Error(`Failed to match measurement name from query="${query}"`);
}
const fields = this.measurements[measurementName];
if (!fields) {
throw new Error(
`Failed to find measurement with name="${measurementName}" in measurements="[${Object.keys(
this.measurements
).join(', ')}]"`
);
}
return fields.map(field => ({
text: field.name,
}));
}
}
function isMeasurementsQuery(query: string) {
return /SHOW MEASUREMENTS/.test(query);
}
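A short usage sketch of the mock above, matching the two query shapes it recognises:
// Sketch: what InfluxDatasourceMock returns for the queries it understands.
const mock = new InfluxDatasourceMock({ logs: [{ name: 'description', type: 'string' }] });
mock.metricFindQuery('SHOW MEASUREMENTS');              // -> [{ text: 'logs' }]
mock.metricFindQuery('SHOW FIELD KEYS FROM "logs"');    // -> [{ text: 'description' }]
mock.metricFindQuery('SHOW FIELD KEYS FROM "missing"'); // throws: measurement not found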

View File

@@ -1,6 +1,6 @@
import _ from 'lodash';
import { dateMath, DataSourceApi, DataSourceInstanceSettings } from '@grafana/data';
import { dateMath, DataSourceApi, DataSourceInstanceSettings, ScopedVars } from '@grafana/data';
import InfluxSeries from './influx_series';
import InfluxQueryModel from './influx_query_model';
import ResponseParser from './response_parser';
@@ -171,7 +171,7 @@ export default class InfluxDatasource extends DataSourceApi<InfluxQuery, InfluxO
return false;
}
interpolateVariablesInQueries(queries: InfluxQuery[]): InfluxQuery[] {
interpolateVariablesInQueries(queries: InfluxQuery[], scopedVars: ScopedVars): InfluxQuery[] {
if (!queries || queries.length === 0) {
return [];
}
@@ -182,11 +182,11 @@ export default class InfluxDatasource extends DataSourceApi<InfluxQuery, InfluxO
const expandedQuery = {
...query,
datasource: this.name,
measurement: this.templateSrv.replace(query.measurement, null, 'regex'),
measurement: this.templateSrv.replace(query.measurement, scopedVars, 'regex'),
};
if (query.rawQuery) {
expandedQuery.query = this.templateSrv.replace(query.query, null, 'regex');
expandedQuery.query = this.templateSrv.replace(query.query, scopedVars, 'regex');
}
if (query.tags) {
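The scopedVars parameter lets per-panel values (for example a repeated panel's variable instance) participate in the interpolation. A small sketch of the call shape; the variable name, measurement, and ds instance are illustrative:
// Sketch: passing scoped variables through interpolateVariablesInQueries.
const scopedVars: ScopedVars = { server: { text: 'backend-01', value: 'backend-01' } };
const interpolated = ds.interpolateVariablesInQueries(
  [{ refId: 'A', measurement: 'cpu_$server' } as InfluxQuery],
  scopedVars
);
// interpolated[0].measurement -> 'cpu_backend-01'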

View File

@@ -1,8 +1,8 @@
import LokiDatasource, { RangeQueryOptions } from './datasource';
import { LokiQuery, LokiResultType, LokiResponse, LokiLegacyStreamResponse } from './types';
import { getQueryOptions } from 'test/helpers/getQueryOptions';
import { AnnotationQueryRequest, DataSourceApi, DataFrame, dateTime, TimeRange } from '@grafana/data';
import { BackendSrv } from 'app/core/services/backend_srv';
import { AnnotationQueryRequest, DataSourceApi, DataFrame, dateTime, TimeRange, FieldCache } from '@grafana/data';
import { TemplateSrv } from 'app/features/templating/template_srv';
import { CustomVariable } from 'app/features/templating/custom_variable';
import { makeMockLokiDatasource } from './mocks';
@@ -196,7 +196,8 @@ describe('LokiDatasource', () => {
const res = await ds.query(options).toPromise();
const dataFrame = res.data[0] as DataFrame;
expect(dataFrame.fields[1].values.get(0)).toBe('hello');
const fieldCache = new FieldCache(dataFrame);
expect(fieldCache.getFieldByName('line').values.get(0)).toBe('hello');
expect(dataFrame.meta.limit).toBe(20);
expect(dataFrame.meta.searchWords).toEqual(['foo']);
});
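Looking fields up by name via FieldCache keeps assertions stable even if the frame's field order changes. A minimal sketch, assuming a dataFrame like the one in the test above:
// Sketch: name-based field access instead of positional indexing.
const cache = new FieldCache(dataFrame);
const lineField = cache.getFieldByName('line'); // undefined if the frame has no such field
const firstLine = lineField ? lineField.values.get(0) : undefined;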

View File

@@ -4,7 +4,7 @@ import { Observable, from, merge, of, iif, defer } from 'rxjs';
import { map, filter, catchError, switchMap, mergeMap } from 'rxjs/operators';
// Services & Utils
import { dateMath } from '@grafana/data';
import { DataFrame, dateMath, FieldCache } from '@grafana/data';
import { addLabelToSelector, keepSelectorFilters } from 'app/plugins/datasource/prometheus/add_label_to_query';
import { BackendSrv, DatasourceRequestOptions } from 'app/core/services/backend_srv';
import { TemplateSrv } from 'app/features/templating/template_srv';
@@ -34,6 +34,7 @@ import {
DataQueryRequest,
DataQueryResponse,
AnnotationQueryRequest,
ScopedVars,
} from '@grafana/data';
import {
@@ -131,7 +132,7 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
.filter(target => target.expr && !target.hide)
.map(target => ({
...target,
expr: this.templateSrv.replace(target.expr, {}, this.interpolateQueryExpr),
expr: this.templateSrv.replace(target.expr, options.scopedVars, this.interpolateQueryExpr),
}));
if (options.exploreMode === ExploreMode.Metrics) {
@@ -353,13 +354,13 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
);
};
interpolateVariablesInQueries(queries: LokiQuery[]): LokiQuery[] {
interpolateVariablesInQueries(queries: LokiQuery[], scopedVars: ScopedVars): LokiQuery[] {
let expandedQueries = queries;
if (queries && queries.length) {
expandedQueries = queries.map(query => ({
...query,
datasource: this.name,
expr: this.templateSrv.replace(query.expr, {}, this.interpolateQueryExpr),
expr: this.templateSrv.replace(query.expr, scopedVars, this.interpolateQueryExpr),
}));
}
@@ -468,7 +469,7 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
return Math.ceil(date.valueOf() * 1e6);
}
getLogRowContext = (row: LogRowModel, options?: LokiContextQueryOptions) => {
getLogRowContext = (row: LogRowModel, options?: LokiContextQueryOptions): Promise<{ data: DataFrame[] }> => {
const target = this.prepareLogRowContextQueryTarget(
row,
(options && options.limit) || 10,
@@ -493,6 +494,7 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
switchMap((res: { data: LokiStreamResponse; status: number }) =>
iif(
() => res.status === 404,
defer(() =>
this._request(LEGACY_QUERY_ENDPOINT, target).pipe(
catchError((err: any) => {
const error: DataQueryError = {
@@ -505,13 +507,16 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
map((res: { data: LokiLegacyStreamResponse }) => ({
data: res.data ? res.data.streams.map(stream => legacyLogStreamToDataFrame(stream, reverse)) : [],
}))
)
),
defer(() =>
of({
data: res.data ? res.data.data.result.map(stream => lokiStreamResultToDataFrame(stream, reverse)) : [],
})
)
)
)
)
.toPromise();
};
@@ -520,8 +525,7 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
.map(label => `${label}="${row.labels[label]}"`) .map(label => `${label}="${row.labels[label]}"`)
.join(','); .join(',');
const contextTimeBuffer = 2 * 60 * 60 * 1000 * 1e6; // 2h buffer const contextTimeBuffer = 2 * 60 * 60 * 1000; // 2h buffer
const timeEpochNs = row.timeEpochMs * 1e6;
const commonTargetOptions = { const commonTargetOptions = {
limit, limit,
query: `{${query}}`, query: `{${query}}`,
@@ -529,18 +533,27 @@ export class LokiDatasource extends DataSourceApi<LokiQuery, LokiOptions> {
direction,
};
const fieldCache = new FieldCache(row.dataFrame);
const nsField = fieldCache.getFieldByName('tsNs')!;
const nsTimestamp = nsField.values.get(row.rowIndex);
if (direction === 'BACKWARD') {
return {
...commonTargetOptions,
start: timeEpochNs - contextTimeBuffer,
end: timeEpochNs, // using RFC3339Nano format to avoid precision loss
// convert to ns, we lose some precision here but it is not that important at the far points of the context
start: row.timeEpochMs - contextTimeBuffer + '000000',
end: nsTimestamp,
direction,
};
} else {
return {
...commonTargetOptions,
start: timeEpochNs,
end: timeEpochNs + contextTimeBuffer,
// start param in Loki API is inclusive, so we'll have to filter out the row this request is based on,
// and any other rows that were logged in the same ns but before it. Right now those rows will be lost
// because they are earlier but still come back in a response that should only return rows after.
start: nsTimestamp,
// convert to ns, we lose some precision here but it is not that important at the far points of the context
end: row.timeEpochMs + contextTimeBuffer + '000000',
};
}
};
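The ms-to-ns conversion above relies on string concatenation: subtracting the 2 h buffer in milliseconds and then appending six zeros yields a nanosecond-precision string without overflowing Number precision. A small illustration with a made-up timestamp:
// Sketch: the millisecond-to-nanosecond string conversion used for the context boundaries.
const contextTimeBuffer = 2 * 60 * 60 * 1000; // 2h in ms
const timeEpochMs = 1580000000123; // example ms timestamp (illustrative)
const start = timeEpochMs - contextTimeBuffer + '000000';
// 1580000000123 - 7200000 = 1579992800123, so start === '1579992800123000000' (ns as a string)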

View File

@@ -67,6 +67,7 @@ export class LiveStreams {
const data = new CircularDataFrame({ capacity: target.size });
data.addField({ name: 'ts', type: FieldType.time, config: { title: 'Time' } });
data.addField({ name: 'tsNs', type: FieldType.time, config: { title: 'Time ns' } });
data.addField({ name: 'line', type: FieldType.string }).labels = parseLabels(target.query);
data.addField({ name: 'labels', type: FieldType.other }); // The labels for each line
data.addField({ name: 'id', type: FieldType.string });

Some files were not shown because too many files have changed in this diff.