Mirror of https://github.com/grafana/grafana.git (synced 2025-12-21 03:54:29 +08:00)

Compare commits: sriram/pos...v4.6.2 (53 commits)
Commits (SHA1):

8db5f087eb, 8a1616305c, 7ca3d32a10, dba7383bb9, 1ebea3775d,
5a80b0aede, 981f4f230c, cac8b976d9, 4366d2f281, c80995b067,
ebaf8620ea, 64d8ae9dcb, 989bf2067c, baeea98473, dce8522575,
c47f670a58, fb50cec096, eb9bdb09b3, 002ac79124, bd11b01aaa,
d7dd7e3c81, ea78d13b54, 83c5853900, 0f0be4e6e3, f308a25589,
9a91a882b4, 6ad6131aaf, 1f10928450, 0cd0aa19d6, c2f2a43197,
7fe1ac5fe7, aec448f7c6, 2ce36c8670, fb35d839c1, c8f5d39d97,
34bc19359d, 3dcae78126, 054c7a154a, 6f3d61f4d2, 305f8c10e9,
fff4cfd11e, d20434f828, 1ef850fae0, 2f7a59fb18, 07be20eeb3,
689b8d79df, b70c538633, 769bc5df21, 4eb6d82254, 7935739eb3,
7403fa0fa7, 5ebfd1e5ab, a1a8c0fc07
```diff
@@ -7,7 +7,12 @@
 - UX changes to nav & side menu
 - New dashboard grid layout system

-# 4.6.0 (unreleased)
+# 4.6.0-beta2 (2017-10-17)
+
+## Fixes
+
+* **ColorPicker**: Fix for color picker not showing [#9549](https://github.com/grafana/grafana/issues/9549)
+
 # 4.6.0-beta1 (2017-10-13)

 ## New Features

 * **GCS**: Adds support for Google Cloud Storage [#8370](https://github.com/grafana/grafana/issues/8370) thx [@chuhlomin](https://github.com/chuhlomin)
```
```diff
@@ -7,7 +7,7 @@ clone_folder: c:\gopath\src\github.com\grafana\grafana
 environment:
   nodejs_version: "6"
   GOPATH: c:\gopath
-  GOVERSION: 1.9.1
+  GOVERSION: 1.9.2

 install:
   - rmdir c:\go /s /q
```

```diff
@@ -9,7 +9,7 @@ machine:
    GOPATH: "/home/ubuntu/.go_workspace"
    ORG_PATH: "github.com/grafana"
    REPO_PATH: "${ORG_PATH}/grafana"
-   GODIST: "go1.9.1.linux-amd64.tar.gz"
+   GODIST: "go1.9.2.linux-amd64.tar.gz"
  post:
    - mkdir -p ~/download
    - mkdir -p ~/docker
```
````diff
@@ -48,7 +48,7 @@ Macro example | Description
 *$__timeFilter(dateColumn)* | Will be replaced by a time range filter using the specified column name. For example, *dateColumn > to_timestamp(1494410783) AND dateColumn < to_timestamp(1494497183)*
 *$__timeFrom()* | Will be replaced by the start of the currently active time selection. For example, *to_timestamp(1494410783)*
 *$__timeTo()* | Will be replaced by the end of the currently active time selection. For example, *to_timestamp(1494497183)*
-*$__timeGroup(dateColumn,'5m')* | Will be replaced by an expression usable in GROUP BY clause. For example, *(extract(epoch from "dateColumn")/extract(epoch from '5m'::interval))::int*
+*$__timeGroup(dateColumn,'5m')* | Will be replaced by an expression usable in GROUP BY clause. For example, *(extract(epoch from "dateColumn")/extract(epoch from '5m'::interval))::int*extract(epoch from '5m'::interval)*
 *$__unixEpochFilter(dateColumn)* | Will be replaced by a time range filter using the specified column name with times represented as unix timestamp. For example, *dateColumn > 1494410783 AND dateColumn < 1494497183*
 *$__unixEpochFrom()* | Will be replaced by the start of the currently active time selection as unix timestamp. For example, *1494410783*
 *$__unixEpochTo()* | Will be replaced by the end of the currently active time selection as unix timestamp. For example, *1494497183*
````

````diff
@@ -94,26 +94,26 @@ Example with `metric` column

 ```sql
 SELECT
-  min(time_date_time) as time,
+  $__timeGroup(time_date_time,'5m') as time,
   min(value_double),
   'min' as metric
 FROM test_data
 WHERE $__timeFilter(time_date_time)
-GROUP BY metric1, (extract(epoch from time_date_time)/extract(epoch from $__interval::interval))::int
-ORDER BY time asc
+GROUP BY time
+ORDER BY time
 ```

 Example with multiple columns:

 ```sql
 SELECT
-  min(time_date_time) as time,
+  $__timeGroup(time_date_time,'5m') as time,
   min(value_double) as min_value,
   max(value_double) as max_value
 FROM test_data
 WHERE $__timeFilter(time_date_time)
-GROUP BY metric1, (extract(epoch from time_date_time)/extract(epoch from $__interval::interval))::int
-ORDER BY time asc
+GROUP BY time
+ORDER BY time
 ```

 ## Templating
````
```diff
@@ -34,6 +34,7 @@ Name | Description
 *Basic Auth* | Enable basic authentication to the Prometheus data source.
 *User* | Name of your Prometheus user
 *Password* | Database user's password
+*Scrape interval* | This will be used as a lower limit for the Prometheus step query parameter. Default value is 15s.

 ## Query editor
```
````diff
@@ -120,6 +120,37 @@ Content-Type: application/json
 PUT /api/annotations/1141 HTTP/1.1
 Accept: application/json
 Content-Type: application/json
+
+{
+  "time":1507037197339,
+  "isRegion":true,
+  "timeEnd":1507180805056,
+  "text":"Annotation Description",
+  "tags":["tag3","tag4","tag5"]
+}
+```
+
+## Delete Annotation By Id
+
+`DELETE /api/annotation/:id`
+
+Deletes the annotation that matches the specified id.
+
+**Example Request**:
+
+```http
+DELETE /api/annotation/1 HTTP/1.1
+Accept: application/json
+Content-Type: application/json
+Authorization: Bearer eyJrIjoiT0tTcG1pUlY2RnVKZTFVaDFsNFZXdE9ZWmNrMkZYbk
+```
+
+**Example Response**:
+
+```http
+HTTP/1.1 200
+Content-Type: application/json
+
 ```

 ## Delete Annotation By RegionId
````
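The delete call above can be issued from any HTTP client. As a sketch, this builds the same request with Go's `net/http`; the base URL is a typical local Grafana address and the helper name is hypothetical, while the path and headers follow the example request above:

```go
package main

import (
	"fmt"
	"net/http"
)

// buildDeleteAnnotationRequest constructs the DELETE request shown in the
// docs above. The base URL and API key are placeholders; substitute your own.
func buildDeleteAnnotationRequest(baseURL string, id int, apiKey string) (*http.Request, error) {
	url := fmt.Sprintf("%s/api/annotation/%d", baseURL, id)
	req, err := http.NewRequest(http.MethodDelete, url, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Accept", "application/json")
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+apiKey)
	return req, nil
}

func main() {
	req, _ := buildDeleteAnnotationRequest("http://localhost:3000", 1, "your-api-key")
	fmt.Println(req.Method, req.URL.Path) // DELETE /api/annotation/1
}
```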
```diff
@@ -13,7 +13,7 @@ dev environment. Grafana ships with its own required backend server; also comple

 ## Dependencies

-- [Go 1.9.1](https://golang.org/dl/)
+- [Go 1.9.2](https://golang.org/dl/)
 - [NodeJS LTS](https://nodejs.org/download/)
 - [Git](https://git-scm.com/downloads)
```

```diff
@@ -4,7 +4,7 @@
     "company": "Grafana Labs"
   },
   "name": "grafana",
-  "version": "4.6.0-beta1",
+  "version": "4.6.2",
   "repository": {
     "type": "git",
     "url": "http://github.com/grafana/grafana.git"
```

```diff
@@ -1,6 +1,6 @@
 #! /usr/bin/env bash
-deb_ver=4.6.0-beta1
-rpm_ver=4.6.0-beta1
+deb_ver=4.6.0-beta3
+rpm_ver=4.6.0-beta3

 wget https://s3-us-west-2.amazonaws.com/grafana-releases/release/grafana_${deb_ver}_amd64.deb
```
```diff
@@ -1,7 +1,6 @@
 package api

 import (
-	"fmt"
 	"strings"
 	"time"
```

```diff
@@ -41,9 +40,22 @@ func GetAnnotations(c *middleware.Context) Response {
 	return Json(200, items)
 }

+type CreateAnnotationError struct {
+	message string
+}
+
+func (e *CreateAnnotationError) Error() string {
+	return e.message
+}
+
 func PostAnnotation(c *middleware.Context, cmd dtos.PostAnnotationsCmd) Response {
 	repo := annotations.GetRepository()

+	if cmd.Text == "" {
+		err := &CreateAnnotationError{"text field should not be empty"}
+		return ApiError(500, "Failed to save annotation", err)
+	}
+
 	item := annotations.Item{
 		OrgId:  c.OrgId,
 		UserId: c.UserId,
```

```diff
@@ -55,6 +67,10 @@ func PostAnnotation(c *middleware.Context, cmd dtos.PostAnnotationsCmd) Response
 		Tags: cmd.Tags,
 	}

+	if item.Epoch == 0 {
+		item.Epoch = time.Now().Unix()
+	}
+
 	if err := repo.Save(&item); err != nil {
 		return ApiError(500, "Failed to save annotation", err)
 	}
```

```diff
@@ -82,21 +98,22 @@ func PostAnnotation(c *middleware.Context, cmd dtos.PostAnnotationsCmd) Response
 	return ApiSuccess("Annotation added")
 }

-type GraphiteAnnotationError struct {
-	message string
-}
-
-func (e *GraphiteAnnotationError) Error() string {
-	return e.message
-}
-
 func formatGraphiteAnnotation(what string, data string) string {
-	return fmt.Sprintf("%s\n%s", what, data)
+	text := what
+	if data != "" {
+		text = text + "\n" + data
+	}
+	return text
 }

 func PostGraphiteAnnotation(c *middleware.Context, cmd dtos.PostGraphiteAnnotationsCmd) Response {
 	repo := annotations.GetRepository()

+	if cmd.What == "" {
+		err := &CreateAnnotationError{"what field should not be empty"}
+		return ApiError(500, "Failed to save Graphite annotation", err)
+	}
+
 	if cmd.When == 0 {
 		cmd.When = time.Now().Unix()
 	}
```

```diff
@@ -106,18 +123,22 @@ func PostGraphiteAnnotation(c *middleware.Context, cmd dtos.PostGraphiteAnnotati
 	var tagsArray []string
 	switch tags := cmd.Tags.(type) {
 	case string:
 		if tags != "" {
 			tagsArray = strings.Split(tags, " ")
 		} else {
 			tagsArray = []string{}
 		}
 	case []interface{}:
 		for _, t := range tags {
 			if tagStr, ok := t.(string); ok {
 				tagsArray = append(tagsArray, tagStr)
 			} else {
-				err := &GraphiteAnnotationError{"tag should be a string"}
+				err := &CreateAnnotationError{"tag should be a string"}
 				return ApiError(500, "Failed to save Graphite annotation", err)
 			}
 		}
 	default:
-		err := &GraphiteAnnotationError{"unsupported tags format"}
+		err := &CreateAnnotationError{"unsupported tags format"}
 		return ApiError(500, "Failed to save Graphite annotation", err)
 	}
```

```diff
@@ -133,7 +154,7 @@ func PostGraphiteAnnotation(c *middleware.Context, cmd dtos.PostGraphiteAnnotati
 		return ApiError(500, "Failed to save Graphite annotation", err)
 	}

-	return ApiSuccess("Graphite Annotation added")
+	return ApiSuccess("Graphite annotation added")
 }

 func UpdateAnnotation(c *middleware.Context, cmd dtos.UpdateAnnotationsCmd) Response {
```
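The tag handling above exists because Graphite clients may send `tags` either as a space-separated string or as a JSON array. The same type switch can be exercised in isolation; `normalizeTags` is a hypothetical standalone helper mirroring that logic, not Grafana's API:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// normalizeTags folds the two accepted wire formats for Graphite tags
// (a space-separated string, or a JSON array of strings) into a []string,
// rejecting anything else.
func normalizeTags(tags interface{}) ([]string, error) {
	switch t := tags.(type) {
	case string:
		if t == "" {
			return []string{}, nil
		}
		return strings.Split(t, " "), nil
	case []interface{}:
		out := []string{}
		for _, v := range t {
			s, ok := v.(string)
			if !ok {
				return nil, errors.New("tag should be a string")
			}
			out = append(out, s)
		}
		return out, nil
	default:
		return nil, errors.New("unsupported tags format")
	}
}

func main() {
	a, _ := normalizeTags("deploy backend")
	b, _ := normalizeTags([]interface{}{"deploy", "backend"})
	fmt.Println(a, b) // both yield [deploy backend]
}
```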
```diff
@@ -267,7 +267,7 @@ func (hs *HttpServer) registerRoutes() {

 	apiRoute.Group("/alerts", func(alertsRoute RouteRegister) {
 		alertsRoute.Post("/test", bind(dtos.AlertTestCommand{}), wrap(AlertTest))
-		alertsRoute.Post("/:alertId/pause", bind(dtos.PauseAlertCommand{}), wrap(PauseAlert), reqEditorRole)
+		alertsRoute.Post("/:alertId/pause", reqEditorRole, bind(dtos.PauseAlertCommand{}), wrap(PauseAlert))
 		alertsRoute.Get("/:alertId", ValidateOrgAlert, wrap(GetAlert))
 		alertsRoute.Get("/", wrap(GetAlerts))
 		alertsRoute.Get("/states-for-dashboard", wrap(GetAlertStatesForDashboard))
```
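The route fix above moves `reqEditorRole` ahead of the request binding and the wrapped handler: handlers registered on a route run in order, so an authorization guard placed last would execute after the action it is meant to protect. A minimal chain illustrating the ordering, with hypothetical handler names (not macaron's actual types):

```go
package main

import "fmt"

// run executes handlers in registration order; a handler returning false
// (a guard rejecting the request) aborts the rest of the chain.
func run(handlers ...func() bool) {
	for _, h := range handlers {
		if !h() {
			return
		}
	}
}

func main() {
	guard := func() bool { fmt.Println("check editor role"); return false }
	action := func() bool { fmt.Println("pause alert"); return true }
	// Guard first: the unauthorized action never runs.
	run(guard, action) // prints only "check editor role"
}
```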
```diff
@@ -188,9 +188,8 @@ func (hs *HttpServer) metricsEndpoint(ctx *macaron.Context) {
 		return
 	}

-	promhttp.HandlerFor(prometheus.DefaultGatherer, promhttp.HandlerOpts{
-		DisableCompression: true,
-	}).ServeHTTP(ctx.Resp, ctx.Req.Request)
+	promhttp.HandlerFor(prometheus.DefaultGatherer, promhttp.HandlerOpts{}).
+		ServeHTTP(ctx.Resp, ctx.Req.Request)
 }

 func (hs *HttpServer) healthHandler(ctx *macaron.Context) {
```
```diff
@@ -21,6 +21,10 @@ func Gziper() macaron.Handler {
 			return
 		}

+		if strings.HasPrefix(requestPath, "/metrics") {
+			return
+		}
+
 		ctx.Invoke(macaronGziper)
 	}
 }
```
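Together with the `DisableCompression` removal above, this middleware change routes gzip handling to one place: the gzip wrapper simply skips any path under `/metrics`. The predicate is trivial but worth pinning down; `shouldSkipGzip` is a hypothetical standalone name for it:

```go
package main

import (
	"fmt"
	"strings"
)

// shouldSkipGzip mirrors the check added above: the Prometheus metrics
// endpoint is excluded from the gzip middleware by path prefix.
func shouldSkipGzip(requestPath string) bool {
	return strings.HasPrefix(requestPath, "/metrics")
}

func main() {
	fmt.Println(shouldSkipGzip("/metrics"))        // true
	fmt.Println(shouldSkipGzip("/api/dashboards")) // false
}
```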
```diff
@@ -18,6 +18,7 @@ const (
 	DS_KAIROSDB   = "kairosdb"
 	DS_PROMETHEUS = "prometheus"
 	DS_POSTGRES   = "postgres"
+	DS_MYSQL      = "mysql"
 	DS_ACCESS_DIRECT = "direct"
 	DS_ACCESS_PROXY  = "proxy"
 )
```

```diff
@@ -64,6 +65,7 @@ var knownDatasourcePlugins map[string]bool = map[string]bool{
 	DS_PROMETHEUS: true,
 	DS_OPENTSDB:   true,
 	DS_POSTGRES:   true,
+	DS_MYSQL:      true,
 	"opennms":      true,
 	"druid":        true,
 	"dalmatinerdb": true,
```
```diff
@@ -40,7 +40,7 @@ func getPluginLogoUrl(pluginType, path, baseUrl string) string {
 }

 func (fp *FrontendPluginBase) setPathsBasedOnApp(app *AppPlugin) {
-	appSubPath := strings.Replace(fp.PluginDir, app.PluginDir, "", 1)
+	appSubPath := strings.Replace(strings.Replace(fp.PluginDir, app.PluginDir, "", 1), "\\", "/", 1)
 	fp.IncludedInAppId = app.Id
 	fp.BaseUrl = app.BaseUrl
```
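The fix above normalizes the app sub-path on Windows, where `PluginDir` uses backslashes that must become forward slashes before being joined into a module URL. A sketch with a hypothetical helper name; note the committed fix replaces only the first backslash (count `1`), which is enough for a single-level sub-path like `\datasource`, while this sketch replaces all of them (count `-1`) for illustration:

```go
package main

import (
	"fmt"
	"strings"
)

// appSubPath strips the app's directory prefix from a plugin directory and
// converts any Windows path separators to URL separators.
func appSubPath(pluginDir, appDir string) string {
	sub := strings.Replace(pluginDir, appDir, "", 1)
	return strings.Replace(sub, "\\", "/", -1)
}

func main() {
	fmt.Println(appSubPath(
		`c:\grafana\public\app\plugins\app\testdata\datasource`,
		`c:\grafana\public\app\plugins\app\testdata`,
	)) // /datasource
}
```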
pkg/plugins/frontend_plugin_test.go (new file, 34 lines)

```diff
@@ -0,0 +1,34 @@
+package plugins
+
+import (
+	"testing"
+
+	"github.com/grafana/grafana/pkg/setting"
+	. "github.com/smartystreets/goconvey/convey"
+)
+
+func TestFrontendPlugin(t *testing.T) {
+
+	Convey("When setting paths based on App on Windows", t, func() {
+		setting.StaticRootPath = "c:\\grafana\\public"
+
+		fp := &FrontendPluginBase{
+			PluginBase: PluginBase{
+				PluginDir: "c:\\grafana\\public\\app\\plugins\\app\\testdata\\datasource",
+				BaseUrl:   "fpbase",
+			},
+		}
+		app := &AppPlugin{
+			FrontendPluginBase: FrontendPluginBase{
+				PluginBase: PluginBase{
+					PluginDir: "c:\\grafana\\public\\app\\plugins\\app\\testdata",
+					Id:        "testdata",
+					BaseUrl:   "public/app/plugins/app/testdata",
+				},
+			},
+		}
+		fp.setPathsBasedOnApp(app)
+
+		So(fp.Module, ShouldEqual, "app/plugins/app/testdata/datasource/module")
+	})
+}
```
```diff
@@ -43,7 +43,7 @@ func (r *SqlAnnotationRepo) ensureTagsExist(sess *DBSession, tags []*models.Tag)
 		var existingTag models.Tag

 		// check if it exists
-		if exists, err := sess.Table("tag").Where("key=? AND value=?", tag.Key, tag.Value).Get(&existingTag); err != nil {
+		if exists, err := sess.Table("tag").Where("`key`=? AND `value`=?", tag.Key, tag.Value).Get(&existingTag); err != nil {
 			return nil, err
 		} else if exists {
 			tag.Id = existingTag.Id
```
```diff
@@ -17,6 +17,7 @@ import (
 	"github.com/aws/aws-sdk-go/aws"
 	"github.com/aws/aws-sdk-go/aws/request"
 	"github.com/aws/aws-sdk-go/service/cloudwatch"
+	"github.com/aws/aws-sdk-go/service/ec2/ec2iface"
 	"github.com/grafana/grafana/pkg/components/null"
 	"github.com/grafana/grafana/pkg/components/simplejson"
 	"github.com/grafana/grafana/pkg/metrics"
```

```diff
@@ -24,6 +25,7 @@ import (

 type CloudWatchExecutor struct {
 	*models.DataSource
+	ec2Svc ec2iface.EC2API
 }

 type DatasourceInfo struct {
```

```diff
@@ -183,6 +183,18 @@ func (e *CloudWatchExecutor) executeMetricFindQuery(ctx context.Context, queryCo
 		data, err = e.handleGetEbsVolumeIds(ctx, parameters, queryContext)
 		break
 	case "ec2_instance_attribute":
+		region := parameters.Get("region").MustString()
+		dsInfo := e.getDsInfo(region)
+		cfg, err := e.getAwsConfig(dsInfo)
+		if err != nil {
+			return nil, errors.New("Failed to call ec2:DescribeInstances")
+		}
+		sess, err := session.NewSession(cfg)
+		if err != nil {
+			return nil, errors.New("Failed to call ec2:DescribeInstances")
+		}
+		e.ec2Svc = ec2.New(sess, cfg)
+
 		data, err = e.handleGetEc2InstanceAttribute(ctx, parameters, queryContext)
 		break
 	}
```

```diff
@@ -373,14 +385,16 @@ func (e *CloudWatchExecutor) handleGetEc2InstanceAttribute(ctx context.Context,

 	var filters []*ec2.Filter
 	for k, v := range filterJson {
-		if vv, ok := v.([]string); ok {
-			var vvvv []*string
+		if vv, ok := v.([]interface{}); ok {
+			var vvvvv []*string
 			for _, vvv := range vv {
-				vvvv = append(vvvv, &vvv)
+				if vvvv, ok := vvv.(string); ok {
+					vvvvv = append(vvvvv, &vvvv)
+				}
 			}
 			filters = append(filters, &ec2.Filter{
 				Name:   aws.String(k),
-				Values: vvvv,
+				Values: vvvvv,
 			})
 		}
 	}
```

```diff
@@ -467,24 +481,13 @@ func (e *CloudWatchExecutor) cloudwatchListMetrics(region string, namespace stri
 }

 func (e *CloudWatchExecutor) ec2DescribeInstances(region string, filters []*ec2.Filter, instanceIds []*string) (*ec2.DescribeInstancesOutput, error) {
-	dsInfo := e.getDsInfo(region)
-	cfg, err := e.getAwsConfig(dsInfo)
-	if err != nil {
-		return nil, errors.New("Failed to call ec2:DescribeInstances")
-	}
-	sess, err := session.NewSession(cfg)
-	if err != nil {
-		return nil, errors.New("Failed to call ec2:DescribeInstances")
-	}
-	svc := ec2.New(sess, cfg)
-
 	params := &ec2.DescribeInstancesInput{
 		Filters:     filters,
 		InstanceIds: instanceIds,
 	}

 	var resp ec2.DescribeInstancesOutput
-	err = svc.DescribeInstancesPages(params,
+	err := e.ec2Svc.DescribeInstancesPages(params,
 		func(page *ec2.DescribeInstancesOutput, lastPage bool) bool {
 			reservations, _ := awsutil.ValuesAtPath(page, "Reservations")
 			for _, reservation := range reservations {
```
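The filter fix above matters because JSON-decoded arrays arrive as `[]interface{}`, never `[]string`, so the old `v.([]string)` assertion always failed and no filters were sent. The same conversion can be shown standalone; this sketch uses a plain address-of in place of `aws.String`, and the helper name is hypothetical:

```go
package main

import "fmt"

// toStringPtrs converts a JSON-decoded value (expected []interface{}) into
// the []*string shape the EC2 filter API wants, skipping non-string items.
func toStringPtrs(v interface{}) []*string {
	var out []*string
	if vv, ok := v.([]interface{}); ok {
		for _, e := range vv {
			if s, ok := e.(string); ok {
				s := s // copy so each pointer is stable (pre-Go 1.22 loop semantics)
				out = append(out, &s)
			}
		}
	}
	return out
}

func main() {
	vals := toStringPtrs([]interface{}{"production", "staging", 42})
	for _, p := range vals {
		fmt.Println(*p)
	}
	// production
	// staging
}
```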
```diff
@@ -1,13 +1,28 @@
 package cloudwatch

 import (
+	"context"
 	"testing"

+	"github.com/aws/aws-sdk-go/aws"
 	"github.com/aws/aws-sdk-go/service/cloudwatch"
+	"github.com/aws/aws-sdk-go/service/ec2"
+	"github.com/aws/aws-sdk-go/service/ec2/ec2iface"
 	"github.com/grafana/grafana/pkg/components/simplejson"
+	"github.com/grafana/grafana/pkg/tsdb"
 	. "github.com/smartystreets/goconvey/convey"
 )

+type mockedEc2 struct {
+	ec2iface.EC2API
+	Resp ec2.DescribeInstancesOutput
+}
+
+func (m mockedEc2) DescribeInstancesPages(in *ec2.DescribeInstancesInput, fn func(*ec2.DescribeInstancesOutput, bool) bool) error {
+	fn(&m.Resp, true)
+	return nil
+}
+
 func TestCloudWatchMetrics(t *testing.T) {

 	Convey("When calling getMetricsForCustomMetrics", t, func() {
```

```diff
@@ -66,4 +81,37 @@ func TestCloudWatchMetrics(t *testing.T) {
 		})
 	})

+	Convey("When calling handleGetEc2InstanceAttribute", t, func() {
+		executor := &CloudWatchExecutor{
+			ec2Svc: mockedEc2{Resp: ec2.DescribeInstancesOutput{
+				Reservations: []*ec2.Reservation{
+					{
+						Instances: []*ec2.Instance{
+							{
+								InstanceId: aws.String("i-12345678"),
+								Tags: []*ec2.Tag{
+									{
+										Key:   aws.String("Environment"),
+										Value: aws.String("production"),
+									},
+								},
+							},
+						},
+					},
+				},
+			}},
+		}
+
+		json := simplejson.New()
+		json.Set("region", "us-east-1")
+		json.Set("attributeName", "InstanceId")
+		filters := make(map[string]interface{})
+		filters["tag:Environment"] = []string{"production"}
+		json.Set("filters", filters)
+		result, _ := executor.handleGetEc2InstanceAttribute(context.Background(), json, &tsdb.TsdbQuery{})
+
+		Convey("Should equal production InstanceId", func() {
+			So(result[0].Text, ShouldEqual, "i-12345678")
+		})
+	})
 }
```
```diff
@@ -2,9 +2,11 @@ package influxdb

 import (
 	"strconv"
+	"time"

 	"github.com/grafana/grafana/pkg/components/simplejson"
 	"github.com/grafana/grafana/pkg/models"
+	"github.com/grafana/grafana/pkg/tsdb"
 )

 type InfluxdbQueryParser struct{}
```

```diff
@@ -37,13 +39,7 @@ func (qp *InfluxdbQueryParser) Parse(model *simplejson.Json, dsInfo *models.Data
 		return nil, err
 	}

-	interval := model.Get("interval").MustString("")
-	if interval == "" && dsInfo.JsonData != nil {
-		dsInterval := dsInfo.JsonData.Get("timeInterval").MustString("")
-		if dsInterval != "" {
-			interval = dsInterval
-		}
-	}
+	parsedInterval, err := tsdb.GetIntervalFrom(dsInfo, model, time.Millisecond*1)

 	return &Query{
 		Measurement: measurement,
```

```diff
@@ -53,7 +49,7 @@ func (qp *InfluxdbQueryParser) Parse(model *simplejson.Json, dsInfo *models.Data
 		Tags:        tags,
 		Selects:     selects,
 		RawQuery:    rawQuery,
-		Interval:    interval,
+		Interval:    parsedInterval,
 		Alias:       alias,
 		UseRawQuery: useRawQuery,
 	}, nil
```

```diff
@@ -2,6 +2,7 @@ package influxdb

 import (
 	"testing"
+	"time"

 	"github.com/grafana/grafana/pkg/components/simplejson"
 	"github.com/grafana/grafana/pkg/models"
```

```diff
@@ -115,7 +116,7 @@ func TestInfluxdbQueryParser(t *testing.T) {
 			So(len(res.GroupBy), ShouldEqual, 3)
 			So(len(res.Selects), ShouldEqual, 3)
 			So(len(res.Tags), ShouldEqual, 2)
-			So(res.Interval, ShouldEqual, ">20s")
+			So(res.Interval, ShouldEqual, time.Second*20)
 			So(res.Alias, ShouldEqual, "serie alias")
 		})
```

```diff
@@ -174,7 +175,7 @@ func TestInfluxdbQueryParser(t *testing.T) {
 			So(len(res.GroupBy), ShouldEqual, 2)
 			So(len(res.Selects), ShouldEqual, 1)
 			So(len(res.Tags), ShouldEqual, 0)
-			So(res.Interval, ShouldEqual, ">10s")
+			So(res.Interval, ShouldEqual, time.Second*10)
 		})
 	})
 }
```
```diff
@@ -1,5 +1,7 @@
 package influxdb

+import "time"
+
 type Query struct {
 	Measurement string
 	Policy      string
```

```diff
@@ -10,8 +12,7 @@ type Query struct {
 	RawQuery    string
 	UseRawQuery bool
 	Alias       string
-
-	Interval string
+	Interval    time.Duration
 }

 type Tag struct {
```

```diff
@@ -29,10 +29,8 @@ func (query *Query) Build(queryContext *tsdb.TsdbQuery) (string, error) {
 		res += query.renderGroupBy(queryContext)
 	}

-	interval, err := getDefinedInterval(query, queryContext)
-	if err != nil {
-		return "", err
-	}
+	calculator := tsdb.NewIntervalCalculator(&tsdb.IntervalOptions{})
+	interval := calculator.Calculate(queryContext.TimeRange, query.Interval)

 	res = strings.Replace(res, "$timeFilter", query.renderTimeFilter(queryContext), -1)
 	res = strings.Replace(res, "$interval", interval.Text, -1)
```

```diff
@@ -41,29 +39,6 @@ func (query *Query) Build(queryContext *tsdb.TsdbQuery) (string, error) {
 	return res, nil
 }

-func getDefinedInterval(query *Query, queryContext *tsdb.TsdbQuery) (*tsdb.Interval, error) {
-	defaultInterval := tsdb.CalculateInterval(queryContext.TimeRange)
-
-	if query.Interval == "" {
-		return &defaultInterval, nil
-	}
-
-	setInterval := strings.Replace(strings.Replace(query.Interval, "<", "", 1), ">", "", 1)
-	parsedSetInterval, err := time.ParseDuration(setInterval)
-
-	if err != nil {
-		return nil, err
-	}
-
-	if strings.Contains(query.Interval, ">") {
-		if defaultInterval.Value > parsedSetInterval {
-			return &defaultInterval, nil
-		}
-	}
-
-	return &tsdb.Interval{Value: parsedSetInterval, Text: setInterval}, nil
-}
-
 func (query *Query) renderTags() []string {
 	var res []string
 	for i, tag := range query.Tags {
```
```diff
@@ -2,6 +2,7 @@ package influxdb

 import (
 	"testing"
+	"time"

 	"strings"
```

```diff
@@ -38,7 +39,7 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 			Measurement: "cpu",
 			Policy:      "policy",
 			GroupBy:     []*QueryPart{groupBy1, groupBy3},
-			Interval:    "10s",
+			Interval:    time.Second * 10,
 		}

 		rawQuery, err := query.Build(queryContext)
```

```diff
@@ -52,7 +53,7 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 			Measurement: "cpu",
 			GroupBy:     []*QueryPart{groupBy1, groupBy2, groupBy3},
 			Tags:        []*Tag{tag1, tag2},
-			Interval:    "5s",
+			Interval:    time.Second * 5,
 		}

 		rawQuery, err := query.Build(queryContext)
```

```diff
@@ -64,7 +65,7 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 		query := &Query{
 			Selects:     []*Select{{*qp1, *qp2, *mathPartDivideBy100}},
 			Measurement: "cpu",
-			Interval:    "5s",
+			Interval:    time.Second * 5,
 		}

 		rawQuery, err := query.Build(queryContext)
```

```diff
@@ -76,7 +77,7 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 		query := &Query{
 			Selects:     []*Select{{*qp1, *qp2, *mathPartDivideByIntervalMs}},
 			Measurement: "cpu",
-			Interval:    "5s",
+			Interval:    time.Second * 5,
 		}

 		rawQuery, err := query.Build(queryContext)
```

```diff
@@ -117,7 +118,7 @@ func TestInfluxdbQueryBuilder(t *testing.T) {
 			Measurement: "cpu",
 			Policy:      "policy",
 			GroupBy:     []*QueryPart{groupBy1, groupBy3},
-			Interval:    "10s",
+			Interval:    time.Second * 10,
 			RawQuery:    "Raw query",
 			UseRawQuery: true,
 		}
```
```diff
@@ -2,14 +2,18 @@ package tsdb

 import (
 	"fmt"
+	"strings"
 	"time"
+
+	"github.com/grafana/grafana/pkg/components/simplejson"
+	"github.com/grafana/grafana/pkg/models"
 )

 var (
 	defaultRes         int64         = 1500
-	minInterval        time.Duration = 1 * time.Millisecond
+	defaultMinInterval time.Duration = 1 * time.Millisecond
 	year               time.Duration = time.Hour * 24 * 365
-	day                time.Duration = time.Hour * 24 * 365
+	day                time.Duration = time.Hour * 24
 )
```

```diff
@@ -17,14 +21,68 @@
 	Value time.Duration
 }

-func CalculateInterval(timerange *TimeRange) Interval {
-	interval := time.Duration((timerange.MustGetTo().UnixNano() - timerange.MustGetFrom().UnixNano()) / defaultRes)
-
-	if interval < minInterval {
-		return Interval{Text: formatDuration(minInterval), Value: interval}
-	}
-
-	return Interval{Text: formatDuration(roundInterval(interval)), Value: interval}
-}
+type intervalCalculator struct {
+	minInterval time.Duration
+}
+
+type IntervalCalculator interface {
+	Calculate(timeRange *TimeRange, minInterval time.Duration) Interval
+}
+
+type IntervalOptions struct {
+	MinInterval time.Duration
+}
+
+func NewIntervalCalculator(opt *IntervalOptions) *intervalCalculator {
+	if opt == nil {
+		opt = &IntervalOptions{}
+	}
+
+	calc := &intervalCalculator{}
+
+	if opt.MinInterval == 0 {
+		calc.minInterval = defaultMinInterval
+	} else {
+		calc.minInterval = opt.MinInterval
+	}
+
+	return calc
+}
+
+func (ic *intervalCalculator) Calculate(timerange *TimeRange, minInterval time.Duration) Interval {
+	to := timerange.MustGetTo().UnixNano()
+	from := timerange.MustGetFrom().UnixNano()
+	interval := time.Duration((to - from) / defaultRes)
+
+	if interval < minInterval {
+		return Interval{Text: formatDuration(minInterval), Value: minInterval}
+	}
+
+	rounded := roundInterval(interval)
+	return Interval{Text: formatDuration(rounded), Value: rounded}
+}
+
+func GetIntervalFrom(dsInfo *models.DataSource, queryModel *simplejson.Json, defaultInterval time.Duration) (time.Duration, error) {
+	interval := queryModel.Get("interval").MustString("")
+
+	if interval == "" && dsInfo.JsonData != nil {
+		dsInterval := dsInfo.JsonData.Get("timeInterval").MustString("")
+		if dsInterval != "" {
+			interval = dsInterval
+		}
+	}
+
+	if interval == "" {
+		return defaultInterval, nil
+	}
+
+	interval = strings.Replace(strings.Replace(interval, "<", "", 1), ">", "", 1)
+	parsedInterval, err := time.ParseDuration(interval)
+	if err != nil {
+		return time.Duration(0), err
+	}
+
+	return parsedInterval, nil
+}

 func formatDuration(inter time.Duration) string {
```
```diff
@@ -14,31 +14,33 @@ func TestInterval(t *testing.T) {
 		HomePath: "../../",
 	})

+	calculator := NewIntervalCalculator(&IntervalOptions{})
+
 	Convey("for 5min", func() {
 		tr := NewTimeRange("5m", "now")

-		interval := CalculateInterval(tr)
+		interval := calculator.Calculate(tr, time.Millisecond*1)
 		So(interval.Text, ShouldEqual, "200ms")
 	})

 	Convey("for 15min", func() {
 		tr := NewTimeRange("15m", "now")

-		interval := CalculateInterval(tr)
+		interval := calculator.Calculate(tr, time.Millisecond*1)
 		So(interval.Text, ShouldEqual, "500ms")
 	})

 	Convey("for 30min", func() {
 		tr := NewTimeRange("30m", "now")

-		interval := CalculateInterval(tr)
+		interval := calculator.Calculate(tr, time.Millisecond*1)
 		So(interval.Text, ShouldEqual, "1s")
 	})

 	Convey("for 1h", func() {
 		tr := NewTimeRange("1h", "now")

-		interval := CalculateInterval(tr)
+		interval := calculator.Calculate(tr, time.Millisecond*1)
 		So(interval.Text, ShouldEqual, "2s")
 	})
```

```diff
@@ -51,6 +53,7 @@
 		So(formatDuration(time.Second*61), ShouldEqual, "1m")
 		So(formatDuration(time.Millisecond*30), ShouldEqual, "30ms")
 		So(formatDuration(time.Hour*23), ShouldEqual, "23h")
+		So(formatDuration(time.Hour*24), ShouldEqual, "1d")
 		So(formatDuration(time.Hour*24*367), ShouldEqual, "1y")
 	})
 })
```
```diff
@@ -74,7 +74,7 @@ func (m *PostgresMacroEngine) evaluateMacro(name string, args []string) (string,
 		if len(args) == 0 {
 			return "", fmt.Errorf("missing time column argument for macro %v", name)
 		}
-		return fmt.Sprintf("%s >= to_timestamp(%d) AND %s <= to_timestamp(%d)", args[0], uint64(m.TimeRange.GetFromAsMsEpoch()/1000), args[0], uint64(m.TimeRange.GetToAsMsEpoch()/1000)), nil
+		return fmt.Sprintf("extract(epoch from %s) BETWEEN %d AND %d", args[0], uint64(m.TimeRange.GetFromAsMsEpoch()/1000), uint64(m.TimeRange.GetToAsMsEpoch()/1000)), nil
 	case "__timeFrom":
 		return fmt.Sprintf("to_timestamp(%d)", uint64(m.TimeRange.GetFromAsMsEpoch()/1000)), nil
 	case "__timeTo":
```

```diff
@@ -83,7 +83,7 @@ func (m *PostgresMacroEngine) evaluateMacro(name string, args []string) (string,
 		if len(args) < 2 {
 			return "", fmt.Errorf("macro %v needs time column and interval", name)
 		}
-		return fmt.Sprintf("(extract(epoch from \"%s\")/extract(epoch from %s::interval))::int", args[0], args[1]), nil
+		return fmt.Sprintf("(extract(epoch from \"%s\")/extract(epoch from %s::interval))::int*extract(epoch from %s::interval)", args[0], args[1], args[1]), nil
 	case "__unixEpochFilter":
 		if len(args) == 0 {
 			return "", fmt.Errorf("missing time column argument for macro %v", name)
```
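The amended `__timeGroup` macro divides the epoch by the interval with an integer cast, then multiplies back, so each row lands on the *start of its bucket* in epoch seconds instead of a bare bucket index (which is what makes the column usable as a time axis). The same arithmetic in Go, for a 5m (300s) interval:

```go
package main

import "fmt"

// timeGroup floors an epoch-seconds timestamp to the start of its bucket,
// mirroring (epoch / interval)::int * interval from the macro above.
func timeGroup(epoch, intervalSecs int64) int64 {
	return (epoch / intervalSecs) * intervalSecs
}

func main() {
	// 1494410783 is the example timestamp from the Postgres docs above.
	fmt.Println(timeGroup(1494410783, 300)) // 1494410700
}
```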
@@ -30,7 +30,7 @@ func TestMacroEngine(t *testing.T) {
|
||||
sql, err := engine.Interpolate(timeRange, "WHERE $__timeFilter(time_column)")
|
||||
So(err, ShouldBeNil)
|
||||
|
||||
So(sql, ShouldEqual, "WHERE time_column >= to_timestamp(18446744066914186738) AND time_column <= to_timestamp(18446744066914187038)")
|
||||
So(sql, ShouldEqual, "WHERE extract(epoch from time_column) BETWEEN 18446744066914186738 AND 18446744066914187038")
|
||||
})
|
||||
|
||||
Convey("interpolate __timeFrom function", func() {
|
||||
@@ -45,7 +45,7 @@ func TestMacroEngine(t *testing.T) {
|
||||
sql, err := engine.Interpolate(timeRange, "GROUP BY $__timeGroup(time_column,'5m')")
|
||||
So(err, ShouldBeNil)
|
||||
|
||||
So(sql, ShouldEqual, "GROUP BY (extract(epoch from \"time_column\")/extract(epoch from '5m'::interval))::int")
|
||||
So(sql, ShouldEqual, "GROUP BY (extract(epoch from \"time_column\")/extract(epoch from '5m'::interval))::int*extract(epoch from '5m'::interval)")
|
||||
})
|
||||
|
||||
Convey("interpolate __timeTo function", func() {
|
||||
|
||||
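The reworked Postgres `$__timeGroup` macro above multiplies the truncated bucket index back by the interval length, so the grouped value is the epoch of the bucket start instead of a bare bucket number. A minimal Go sketch of that arithmetic (assuming floor semantics; Postgres' `::int` cast actually rounds to the nearest integer, so real bucket edges can differ by up to half an interval):

```go
package main

import "fmt"

// timeGroup mirrors the SQL emitted by the new macro:
// (extract(epoch from col)/extract(epoch from interval))::int * interval.
// It maps an epoch timestamp to the start of its bucket.
func timeGroup(epochSec, intervalSec int64) int64 {
	return (epochSec / intervalSec) * intervalSec
}

func main() {
	// 300s corresponds to the '5m' interval used in the test above.
	fmt.Println(timeGroup(1508150123, 300)) // → 1508150100
}
```

Grouping by the bucket start epoch lets the frontend plot each group directly on the time axis, which a bare `::int` bucket index could not do.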
@@ -50,12 +50,14 @@ func NewPrometheusExecutor(dsInfo *models.DataSource) (tsdb.TsdbQueryEndpoint, e
var (
plog log.Logger
legendFormat *regexp.Regexp
intervalCalculator tsdb.IntervalCalculator
)

func init() {
plog = log.New("tsdb.prometheus")
tsdb.RegisterTsdbQueryEndpoint("prometheus", NewPrometheusExecutor)
legendFormat = regexp.MustCompile(`\{\{\s*(.+?)\s*\}\}`)
intervalCalculator = tsdb.NewIntervalCalculator(&tsdb.IntervalOptions{MinInterval: time.Second * 1})
}

func (e *PrometheusExecutor) getClient(dsInfo *models.DataSource) (apiv1.API, error) {

@@ -88,7 +90,7 @@ func (e *PrometheusExecutor) Query(ctx context.Context, dsInfo *models.DataSourc
return nil, err
}

query, err := parseQuery(tsdbQuery.Queries, tsdbQuery)
query, err := parseQuery(dsInfo, tsdbQuery.Queries, tsdbQuery)
if err != nil {
return nil, err
}

@@ -138,7 +140,7 @@ func formatLegend(metric model.Metric, query *PrometheusQuery) string {
return string(result)
}

func parseQuery(queries []*tsdb.Query, queryContext *tsdb.TsdbQuery) (*PrometheusQuery, error) {
func parseQuery(dsInfo *models.DataSource, queries []*tsdb.Query, queryContext *tsdb.TsdbQuery) (*PrometheusQuery, error) {
queryModel := queries[0]

expr, err := queryModel.Model.Get("expr").String()

@@ -146,11 +148,6 @@ func parseQuery(queries []*tsdb.Query, queryContext *tsdb.TsdbQuery) (*Prometheu
return nil, err
}

step, err := queryModel.Model.Get("step").Int64()
if err != nil {
return nil, err
}

format := queryModel.Model.Get("legendFormat").MustString("")

start, err := queryContext.TimeRange.ParseFrom()

@@ -163,9 +160,18 @@ func parseQuery(queries []*tsdb.Query, queryContext *tsdb.TsdbQuery) (*Prometheu
return nil, err
}

dsInterval, err := tsdb.GetIntervalFrom(dsInfo, queryModel.Model, time.Second*15)
if err != nil {
return nil, err
}

intervalFactor := queryModel.Model.Get("intervalFactor").MustInt64(1)
interval := intervalCalculator.Calculate(queryContext.TimeRange, dsInterval)
step := time.Duration(int64(interval.Value) * intervalFactor)

return &PrometheusQuery{
Expr: expr,
Step: time.Second * time.Duration(step),
Step: step,
LegendFormat: format,
Start: start,
End: end,
@@ -2,13 +2,21 @@ package prometheus

import (
"testing"
"time"

"github.com/grafana/grafana/pkg/models"
"github.com/grafana/grafana/pkg/tsdb"

"github.com/grafana/grafana/pkg/components/simplejson"
p "github.com/prometheus/common/model"
. "github.com/smartystreets/goconvey/convey"
)

func TestPrometheus(t *testing.T) {
Convey("Prometheus", t, func() {
dsInfo := &models.DataSource{
JsonData: simplejson.New(),
}

Convey("converting metric name", func() {
metric := map[p.LabelName]p.LabelValue{

@@ -36,5 +44,108 @@ func TestPrometheus(t *testing.T) {

So(formatLegend(metric, query), ShouldEqual, `http_request_total{app="backend", device="mobile"}`)
})

Convey("parsing query model with step", func() {
json := `{
"expr": "go_goroutines",
"format": "time_series",
"refId": "A"
}`
jsonModel, _ := simplejson.NewJson([]byte(json))
queryContext := &tsdb.TsdbQuery{}
queryModels := []*tsdb.Query{
{Model: jsonModel},
}

Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("12h", "now")

model, err := parseQuery(dsInfo, queryModels, queryContext)

So(err, ShouldBeNil)
So(model.Step, ShouldEqual, time.Second*30)
})
})

Convey("parsing query model without step parameter", func() {
json := `{
"expr": "go_goroutines",
"format": "time_series",
"intervalFactor": 1,
"refId": "A"
}`
jsonModel, _ := simplejson.NewJson([]byte(json))
queryContext := &tsdb.TsdbQuery{}
queryModels := []*tsdb.Query{
{Model: jsonModel},
}

Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("48h", "now")

model, err := parseQuery(dsInfo, queryModels, queryContext)

So(err, ShouldBeNil)
So(model.Step, ShouldEqual, time.Minute*2)
})

Convey("with 1h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("1h", "now")

model, err := parseQuery(dsInfo, queryModels, queryContext)

So(err, ShouldBeNil)
So(model.Step, ShouldEqual, time.Second*15)
})
})

Convey("parsing query model with intervalFactor", func() {
Convey("high intervalFactor", func() {
json := `{
"expr": "go_goroutines",
"format": "time_series",
"intervalFactor": 10,
"refId": "A"
}`
jsonModel, _ := simplejson.NewJson([]byte(json))
queryContext := &tsdb.TsdbQuery{}
queryModels := []*tsdb.Query{
{Model: jsonModel},
}

Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("48h", "now")

model, err := parseQuery(dsInfo, queryModels, queryContext)

So(err, ShouldBeNil)
So(model.Step, ShouldEqual, time.Minute*20)
})
})

Convey("low intervalFactor", func() {
json := `{
"expr": "go_goroutines",
"format": "time_series",
"intervalFactor": 1,
"refId": "A"
}`
jsonModel, _ := simplejson.NewJson([]byte(json))
queryContext := &tsdb.TsdbQuery{}
queryModels := []*tsdb.Query{
{Model: jsonModel},
}

Convey("with 48h time range", func() {
queryContext.TimeRange = tsdb.NewTimeRange("48h", "now")

model, err := parseQuery(dsInfo, queryModels, queryContext)

So(err, ShouldBeNil)
So(model.Step, ShouldEqual, time.Minute*2)
})
})
})

})
}
@@ -1,5 +1,5 @@
import React from 'react';
import coreModule from '../core_module';
import { react2AngularDirective } from 'app/core/utils/react2angular';

export interface IProps {
password: string;

@@ -33,7 +33,5 @@ export class PasswordStrength extends React.Component<IProps, any> {
}
}

coreModule.directive('passwordStrength', function(reactDirective) {
return reactDirective(PasswordStrength, ['password']);
});
react2AngularDirective('passwordStrength', PasswordStrength, ['password']);

@@ -1,5 +1,4 @@
import React from 'react';
import coreModule from 'app/core/core_module';
import { sortedColors } from 'app/core/utils/colors';

export interface IProps {

@@ -23,12 +22,15 @@ export class GfColorPalette extends React.Component<IProps, any> {
}

render() {
const colorPaletteItems = this.paletteColors.map((paletteColor) => {
const colorPaletteItems = this.paletteColors.map(paletteColor => {
const cssClass = paletteColor.toLowerCase() === this.props.color.toLowerCase() ? 'fa-circle-o' : 'fa-circle';
return (
<i key={paletteColor} className={"pointer fa " + cssClass}
style={{'color': paletteColor}}
onClick={this.onColorSelect(paletteColor)}>
<i
key={paletteColor}
className={'pointer fa ' + cssClass}
style={{ color: paletteColor }}
onClick={this.onColorSelect(paletteColor)}>

</i>
);
});

@@ -40,6 +42,3 @@ export class GfColorPalette extends React.Component<IProps, any> {
}
}

coreModule.directive('gfColorPalette', function (reactDirective) {
return reactDirective(GfColorPalette, ['color', 'onColorSelect']);
});

@@ -2,8 +2,8 @@ import React from 'react';
import ReactDOM from 'react-dom';
import $ from 'jquery';
import Drop from 'tether-drop';
import coreModule from 'app/core/core_module';
import { ColorPickerPopover } from './ColorPickerPopover';
import { react2AngularDirective } from 'app/core/utils/react2angular';

export interface IProps {
color: string;

@@ -27,9 +27,7 @@ export class ColorPicker extends React.Component<IProps, any> {
}

openColorPicker() {
const dropContent = (
<ColorPickerPopover color={this.props.color} onColorSelect={this.onColorSelect} />
);
const dropContent = <ColorPickerPopover color={this.props.color} onColorSelect={this.onColorSelect} />;

let dropContentElem = document.createElement('div');
ReactDOM.render(dropContent, dropContentElem);

@@ -38,12 +36,12 @@ export class ColorPicker extends React.Component<IProps, any> {
target: this.pickerElem[0],
content: dropContentElem,
position: 'top center',
classes: 'drop-popover drop-popover--form',
openOn: 'hover',
classes: 'drop-popover',
openOn: 'click',
hoverCloseDelay: 200,
tetherOptions: {
constraints: [{ to: 'scrollParent', attachment: "none both" }]
}
constraints: [{ to: 'scrollParent', attachment: 'none both' }],
},
});

drop.on('close', this.closeColorPicker);

@@ -68,17 +66,14 @@ export class ColorPicker extends React.Component<IProps, any> {
return (
<div className="sp-replacer sp-light" onClick={this.openColorPicker} ref={this.setPickerElem}>
<div className="sp-preview">
<div className="sp-preview-inner" style={{backgroundColor: this.props.color}}>
</div>
<div className="sp-preview-inner" style={{ backgroundColor: this.props.color }} />
</div>
</div>
);
}
}

coreModule.directive('colorPicker', function (reactDirective) {
return reactDirective(ColorPicker, [
react2AngularDirective('colorPicker', ColorPicker, [
'color',
['onChange', { watchDepth: 'reference', wrapApply: true }]
['onChange', { watchDepth: 'reference', wrapApply: true }],
]);
});

@@ -1,7 +1,6 @@
import React from 'react';
import $ from 'jquery';
import tinycolor from 'tinycolor2';
import coreModule from 'app/core/core_module';
import { GfColorPalette } from './ColorPalette';
import { GfSpectrumPicker } from './SpectrumPicker';

@@ -57,10 +56,11 @@ export class ColorPickerPopover extends React.Component<IProps, any> {
let newColor = tinycolor(colorString);
if (newColor.isValid()) {
// Update only color state
let newColorString = newColor.toString();
this.setState({
color: newColor.toString(),
color: newColorString,
});
this.props.onColorSelect(newColor);
this.props.onColorSelect(newColorString);
}
}

@@ -115,7 +115,3 @@ export class ColorPickerPopover extends React.Component<IProps, any> {
);
}
}

coreModule.directive('gfColorPickerPopover', function (reactDirective) {
return reactDirective(ColorPickerPopover, ['color', 'onColorSelect']);
});

@@ -1,6 +1,6 @@
import React from 'react';
import coreModule from 'app/core/core_module';
import { ColorPickerPopover } from './ColorPickerPopover';
import { react2AngularDirective } from 'app/core/utils/react2angular';

export interface IProps {
series: any;

@@ -43,13 +43,11 @@ export class SeriesColorPicker extends React.Component<IProps, any> {
render() {
return (
<div className="graph-legend-popover">
{this.props.series && this.renderAxisSelection()}
{this.props.series.yaxis && this.renderAxisSelection()}
<ColorPickerPopover color={this.props.series.color} onColorSelect={this.onColorChange} />
</div>
);
}
}

coreModule.directive('seriesColorPicker', function(reactDirective) {
return reactDirective(SeriesColorPicker, ['series', 'onColorChange', 'onToggleAxis']);
});
react2AngularDirective('seriesColorPicker', SeriesColorPicker, ['series', 'onColorChange', 'onToggleAxis']);
@@ -1,5 +1,4 @@
import React from 'react';
import coreModule from 'app/core/core_module';
import _ from 'lodash';
import $ from 'jquery';
import 'vendor/spectrum';

@@ -71,6 +70,3 @@ export class GfSpectrumPicker extends React.Component<IProps, any> {
}
}

coreModule.directive('gfSpectrumPicker', function (reactDirective) {
return reactDirective(GfSpectrumPicker, ['color', 'options', 'onColorSelect']);
});

@@ -5,6 +5,7 @@
*/
import coreModule from '../../core_module';

/** @ngInject */
export function spectrumPicker() {
return {
restrict: 'E',

@@ -159,6 +159,8 @@ export class FormDropdownCtrl {
}

updateValue(text) {
text = _.unescape(text);

if (text === '' || this.text === text) {
return;
}

@@ -31,8 +31,8 @@ function (_, $, coreModule) {
}
});

$scope.$watch('playlistSrv', function(newValue) {
elem.toggleClass('playlist-active', _.isObject(newValue));
$scope.$watch('playlistSrv.isPlaying', function(newValue) {
elem.toggleClass('playlist-active', newValue === true);
});
}
};

@@ -7,14 +7,15 @@ export class DeltaCtrl {
observer: any;

/** @ngInject */
constructor($rootScope) {
const waitForCompile = function(mutations) {
constructor(private $rootScope) {

const waitForCompile = (mutations) => {
if (mutations.length === 1) {
this.$rootScope.appEvent('json-diff-ready');
}
};

this.observer = new MutationObserver(waitForCompile.bind(this));
this.observer = new MutationObserver(waitForCompile);

const observerConfig = {
attributes: true,

@@ -39,6 +39,8 @@ function (_, $, coreModule) {
return;
}

value = _.unescape(value);

$scope.$apply(function() {
var selected = _.find($scope.altSegments, {value: value});
if (selected) {

@@ -1,7 +1,6 @@
import _ from 'lodash';
import moment from 'moment';

declare var window: any;
import {saveAs} from 'file-saver';

const DEFAULT_DATETIME_FORMAT = 'YYYY-MM-DDTHH:mm:ssZ';

@@ -69,5 +68,5 @@ export function exportTableDataToCsv(table, excel = false) {

export function saveSaveBlob(payload, fname) {
var blob = new Blob([payload], { type: "text/csv;charset=utf-8" });
window.saveAs(blob, fname);
saveAs(blob, fname);
}
public/app/core/utils/react2angular.ts (new file, +10)
@@ -0,0 +1,10 @@
import coreModule from 'app/core/core_module';

export function react2AngularDirective(name: string, component: any, options: any) {

coreModule.directive(name, ['reactDirective', reactDirective => {
return reactDirective(component, options);
}]);

}

@@ -383,6 +383,7 @@ export class AlertTabCtrl {

test() {
this.testing = true;
this.testResult = false;

var payload = {
dashboard: this.dashboardSrv.getCurrent().getSaveModelClone(),

@@ -39,7 +39,7 @@ export function annotationTooltipDirective($sanitize, dashboardSrv, contextSrv,
text = text + '<br />' + event.text;
}
} else if (title) {
text = title + '<br />' + text;
text = title + '<br />' + (_.isString(text) ? text : '');
title = '';
}

@@ -20,10 +20,12 @@ export class DashboardCtrl {
dynamicDashboardSrv,
dashboardViewStateSrv,
contextSrv,
playlistSrv,
alertSrv,
$timeout) {

$scope.editor = { index: 0 };
$scope.playlistSrv = playlistSrv;

var resizeEventTimeout;

@@ -19,6 +19,7 @@ export class DashNavCtrl {
private $location,
private backendSrv,
private contextSrv,
public playlistSrv,
navModelSrv) {
this.navModel = navModelSrv.getDashboardNav(this.dashboard, this);

@@ -1,8 +1,7 @@
///<reference path="../../../headers/common.d.ts" />

import angular from 'angular';
import coreModule from 'app/core/core_module';
import {saveAs} from 'file-saver';

import coreModule from 'app/core/core_module';
import {DashboardExporter} from './exporter';

export class DashExportCtrl {

@@ -22,9 +21,8 @@ export class DashExportCtrl {
}

save() {
var blob = new Blob([angular.toJson(this.dash, true)], { type: "application/json;charset=utf-8" });
var wnd: any = window;
wnd.saveAs(blob, this.dash.title + '-' + new Date().getTime() + '.json');
var blob = new Blob([angular.toJson(this.dash, true)], {type: 'application/json;charset=utf-8'});
saveAs(blob, this.dash.title + '-' + new Date().getTime() + '.json');
}

saveJson() {

@@ -44,7 +42,7 @@ export function dashExportDirective() {
controller: DashExportCtrl,
bindToController: true,
controllerAs: 'ctrl',
scope: {dismiss: "&"}
scope: {dismiss: '&'},
};
}
@@ -35,7 +35,6 @@ export class DashboardModel {
gnetId: any;
meta: any;
events: any;
editMode: boolean;

constructor(data, meta?) {
if (!data) {

@@ -3,13 +3,10 @@
<i class="fa fa-remove"></i>
</a>

<div class="gf-form-inline dash-row-add-panel-form">
<div class="gf-form">
<input type="text" class="gf-form-input max-width-14" ng-model='ctrl.panelSearch' give-focus='true' ng-keydown="ctrl.keyDown($event)" ng-change="ctrl.panelSearchChanged()" placeholder="panel search filter"></input>
</div>
<div class="gf-form width-10">
<input type="text" class="gf-form-input width-10" ng-model='ctrl.panelSearch' give-focus='true' ng-keydown="ctrl.keyDown($event)" ng-change="ctrl.panelSearchChanged()" placeholder="panel search filter"></input>
</div>

<div class="add-panel-panels-scroll">
<div class="add-panel-panels">
<div class="add-panel-item"
ng-repeat="panel in ctrl.panelHits"

@@ -22,6 +19,5 @@
<div class="add-panel-item-name">{{panel.name}}</div>
</div>
</div>
</div>

</div>

@@ -49,7 +49,10 @@ export class SaveDashboardAsModalCtrl {
if (dashboard.id > 0) {
this.clone.rows.forEach(row => {
row.panels.forEach(panel => {
if (panel.type === "graph" && panel.alert) {
delete panel.thresholds;
}

delete panel.alert;
});
});

public/app/features/dashboard/specs/save_as_modal.jest.ts (new file, +67)
@@ -0,0 +1,67 @@
import {SaveDashboardAsModalCtrl} from '../save_as_modal';
import {describe, it, expect} from 'test/lib/common';

describe('saving dashboard as', () => {
function scenario(name, panel, verify) {
describe(name, () => {
var json = {
title: "name",
rows: [ { panels: [
panel
]}]
};

var mockDashboardSrv = {
getCurrent: function() {
return {
id: 5,
getSaveModelClone: function() {
return json;
}
};
}
};

var ctrl = new SaveDashboardAsModalCtrl(mockDashboardSrv);
var ctx: any = {
clone: ctrl.clone,
ctrl: ctrl,
panel: {}
};
for (let row of ctrl.clone.rows) {
for (let panel of row.panels) {
ctx.panel = panel;
}
}
it("verify", () => {
verify(ctx);
});
});
}

scenario("default values", {}, (ctx) => {
var clone = ctx.clone;
expect(clone.id).toBe(null);
expect(clone.title).toBe("name Copy");
expect(clone.editable).toBe(true);
expect(clone.hideControls).toBe(false);
});

var graphPanel = { id: 1, type: "graph", alert: { rule: 1}, thresholds: { value: 3000} };

scenario("should remove alert from graph panel", graphPanel , (ctx) => {
expect(ctx.panel.alert).toBe(undefined);
});

scenario("should remove threshold from graph panel", graphPanel, (ctx) => {
expect(ctx.panel.thresholds).toBe(undefined);
});

scenario("singlestat should keep threshold", { id: 1, type: "singlestat", thresholds: { value: 3000} }, (ctx) => {
expect(ctx.panel.thresholds).not.toBe(undefined);
});

scenario("table should keep threshold", { id: 1, type: "table", thresholds: { value: 3000} }, (ctx) => {
expect(ctx.panel.thresholds).not.toBe(undefined);
});
});
@@ -73,7 +73,6 @@ function(angular, _) {
dash.time = 0;
dash.refresh = 0;
dash.schemaVersion = 0;
dash.editMode = false;

// filter row and panels properties that should be ignored
dash.rows = _.filter(dash.rows, function(row) {

@@ -154,7 +154,6 @@ function (angular, _, $, config) {

ctrl.editMode = false;
ctrl.fullscreen = false;
ctrl.dashboard.editMode = this.oldDashboardEditMode;

this.$scope.appEvent('panel-fullscreen-exit', {panelId: ctrl.panel.id});

@@ -176,10 +175,8 @@ function (angular, _, $, config) {
ctrl.editMode = this.state.edit && this.dashboard.meta.canEdit;
ctrl.fullscreen = true;

this.oldDashboardEditMode = this.dashboard.editMode;
this.oldTimeRange = ctrl.range;
this.fullscreenPanel = panelScope;
this.dashboard.editMode = false;

$(window).scrollTop(0);

@@ -9,11 +9,23 @@ import config from 'app/core/config';
import TimeSeries from 'app/core/time_series2';
import TableModel from 'app/core/table_model';
import {coreModule, appEvents, contextSrv} from 'app/core/core';
import * as datemath from 'app/core/utils/datemath';
import * as fileExport from 'app/core/utils/file_export';
import * as flatten from 'app/core/utils/flatten';
import * as ticks from 'app/core/utils/ticks';
import {impressions} from 'app/features/dashboard/impression_store';
import builtInPlugins from './built_in_plugins';
import d3 from 'vendor/d3/d3';

// rxjs
import {Observable} from 'rxjs/Observable';
import {Subject} from 'rxjs/Subject';
import * as datemath from 'app/core/utils/datemath';
import builtInPlugins from './buit_in_plugins';
import d3 from 'vendor/d3/d3';

// these imports add functions to Observable
import 'rxjs/add/observable/empty';
import 'rxjs/add/observable/from';
import 'rxjs/add/operator/map';
import 'rxjs/add/operator/combineAll';

System.config({
baseURL: 'public',

@@ -27,6 +39,12 @@ System.config({
text: 'vendor/plugin-text/text.js',
css: 'vendor/plugin-css/css.js'
},
meta: {
'*': {
esModule: true,
authorization: true,
}
}
});

// add cache busting

@@ -49,24 +67,39 @@ exposeToPlugin('lodash', _);
exposeToPlugin('moment', moment);
exposeToPlugin('jquery', jquery);
exposeToPlugin('angular', angular);
exposeToPlugin('d3', d3);
exposeToPlugin('rxjs/Subject', Subject);
exposeToPlugin('rxjs/Observable', Observable);
exposeToPlugin('d3', d3);

// backward compatible path
exposeToPlugin('vendor/npm/rxjs/Rx', {
Subject: Subject,
Observable: Observable
});

exposeToPlugin('app/features/dashboard/impression_store', {
impressions: impressions,
__esModule: true
});

exposeToPlugin('app/plugins/sdk', sdk);
exposeToPlugin('app/core/utils/datemath', datemath);
exposeToPlugin('app/core/utils/file_export', fileExport);
exposeToPlugin('app/core/utils/flatten', flatten);
exposeToPlugin('app/core/utils/kbn', kbn);
exposeToPlugin('app/core/utils/ticks', ticks);

exposeToPlugin('app/core/config', config);
exposeToPlugin('app/core/time_series', TimeSeries);
exposeToPlugin('app/core/time_series2', TimeSeries);
exposeToPlugin('app/core/table_model', TableModel);
exposeToPlugin('app/core/app_events', appEvents);
exposeToPlugin('app/core/core_module', coreModule);
exposeToPlugin('app/core/core_module', coreModule);
exposeToPlugin('app/core/core', {
coreModule: coreModule,
appEvents: appEvents,
contextSrv: contextSrv,
__esModule: true
});

import 'vendor/flot/jquery.flot';

@@ -79,7 +112,11 @@ import 'vendor/flot/jquery.flot.fillbelow';
import 'vendor/flot/jquery.flot.crosshair';
import 'vendor/flot/jquery.flot.dashes';

for (let flotDep of ['jquery.flot', 'jquery.flot.pie', 'jquery.flot.time']) {
const flotDeps = [
'jquery.flot', 'jquery.flot.pie', 'jquery.flot.time', 'jquery.flot.fillbelow', 'jquery.flot.crosshair',
'jquery.flot.stack', 'jquery.flot.selection', 'jquery.flot.stackpercent', 'jquery.flot.events'
];
for (let flotDep of flotDeps) {
exposeToPlugin(flotDep, {fakeDep: 1});
}
@@ -41,7 +41,7 @@ function (angular, _, moment, dateMath, kbn, templatingVariable) {
item.namespace = templateSrv.replace(item.namespace, options.scopedVars);
item.metricName = templateSrv.replace(item.metricName, options.scopedVars);
item.dimensions = self.convertDimensionFormat(item.dimensions, options.scopeVars);
item.period = self.getPeriod(item, options);
item.period = String(self.getPeriod(item, options)); // use string format for period in graph query, and alerting

return _.extend({
refId: item.refId,

@@ -318,6 +318,8 @@ function (angular, _, moment, dateMath, kbn, templatingVariable) {

return this.getDimensionValues(region, namespace, metricName, 'ServiceName', dimensions).then(function () {
return { status: 'success', message: 'Data source is working' };
}, function (err) {
return { status: 'error', message: err.message };
});
};

@@ -352,6 +354,7 @@ function (angular, _, moment, dateMath, kbn, templatingVariable) {
var t = angular.copy(target);
var scopedVar = {};
scopedVar[variable.name] = v;
t.refId = target.refId + '_' + v.value;
t.dimensions[dimensionKey] = templateSrv.replace(t.dimensions[dimensionKey], scopedVar);
return t;
}).value();

@@ -37,7 +37,7 @@ describe('CloudWatchDatasource', function() {
InstanceId: 'i-12345678'
},
statistics: ['Average'],
period: 300
period: '300'
}
]
};

@@ -109,7 +109,7 @@ describe('CloudWatchDatasource', function() {

ctx.ds.query(query).then(function() {
var params = requestParams.queries[0];
expect(params.period).to.be(600);
expect(params.period).to.be('600');
done();
});
ctx.$rootScope.$apply();

@@ -75,6 +75,12 @@ export class ElasticDatasource {
return this.request('POST', url, data).then(function(results) {
results.data.$$config = results.config;
return results.data;
}).catch(err => {
if (err.data && err.data.error) {
throw {message: 'Elasticsearch error: ' + err.data.error.reason, error: err.data.error};
}

throw err;
});
}

@@ -20,6 +20,10 @@ export class PostgresDatasource {
return '\'' + value + '\'';
}

if (typeof value === 'number') {
return value.toString();
}

var quotedValues = _.map(value, function(val) {
return '\'' + val + '\'';
});

@@ -16,8 +16,8 @@ class PostgresConfigCtrl {

const defaultQuery = `SELECT
extract(epoch from time_column) AS time,
title_column as title,
description_column as text
text_column as text,
tags_column as tags
FROM
metric_table
WHERE

@@ -8,7 +8,7 @@ export class PromCompleter {
labelNameCache: any;
labelValueCache: any;

identifierRegexps = [/[\[\]a-zA-Z_0-9=]/];
identifierRegexps = [/\[/, /[a-zA-Z0-9_:]/];

constructor(private datasource: PrometheusDatasource) {
this.labelQueryCache = {};

@@ -73,13 +73,15 @@ export class PromCompleter {
});
}

if (prefix === '[') {
if (token.type === 'paren.lparen' && token.value === '[') {
var vectors = [];
for (let unit of ['s', 'm', 'h']) {
for (let value of [1,5,10,30]) {
vectors.push({caption: value+unit, value: '['+value+unit, meta: 'range vector'});
}
}
vectors.push({caption: '$__interval', value: '[$__interval', meta: 'range vector'});
vectors.push({caption: '$__interval_ms', value: '[$__interval_ms', meta: 'range vector'});
callback(null, vectors);
return;
}
File diff suppressed because it is too large
Load Diff
@@ -1,5 +1,3 @@
///<reference path="../../../headers/common.d.ts" />

import _ from 'lodash';

import kbn from 'app/core/utils/kbn';
@@ -21,6 +19,7 @@ export class PrometheusDatasource {
  basicAuth: any;
  withCredentials: any;
  metricsNameCache: any;
  interval: string;

  /** @ngInject */
  constructor(instanceSettings,
@@ -36,6 +35,7 @@ export class PrometheusDatasource {
    this.directUrl = instanceSettings.directUrl;
    this.basicAuth = instanceSettings.basicAuth;
    this.withCredentials = instanceSettings.withCredentials;
    this.interval = instanceSettings.jsonData.timeInterval || '15s';
  }

  _request(method, url, requestId?) {
@@ -122,7 +122,7 @@ export class PrometheusDatasource {
      } else {
        for (let metricData of response.data.data.result) {
          if (response.data.data.resultType === 'matrix') {
            result.push(self.transformMetricData(metricData, activeTargets[index], start, end));
            result.push(self.transformMetricData(metricData, activeTargets[index], start, end, queries[index].step));
          } else if (response.data.data.resultType === 'vector') {
            result.push(self.transformInstantMetricData(metricData, activeTargets[index]));
          }
@@ -144,7 +144,6 @@ export class PrometheusDatasource {
    var intervalFactor = target.intervalFactor || 1;
    // Adjust the interval to take into account any specified minimum and interval factor plus Prometheus limits
    var adjustedInterval = this.adjustInterval(interval, minInterval, range, intervalFactor);

    var scopedVars = options.scopedVars;
    // If the interval was adjusted, make a shallow copy of scopedVars with updated interval vars
    if (interval !== adjustedInterval) {
@@ -154,7 +153,7 @@ export class PrometheusDatasource {
        "__interval_ms": {text: interval * 1000, value: interval * 1000},
      });
    }
    target.step = query.step = interval;
    query.step = interval;

    // Only replace vars in expression after having (possibly) updated interval vars
    query.expr = this.templateSrv.replace(target.expr, scopedVars, this.interpolateQueryExpr);
@@ -168,7 +167,7 @@ export class PrometheusDatasource {
    if (interval !== 0 && range / intervalFactor / interval > 11000) {
      interval = Math.ceil(range / intervalFactor / 11000);
    }
    return Math.max(interval * intervalFactor, minInterval);
    return Math.max(interval * intervalFactor, minInterval, 1);
  }

  performTimeSeriesQuery(query, start, end) {
@@ -272,13 +271,13 @@ export class PrometheusDatasource {
    });
  }

  transformMetricData(md, options, start, end) {
  transformMetricData(md, options, start, end, step) {
    var dps = [],
        metricLabel = null;

    metricLabel = this.createMetricLabel(md.metric, options);

    var stepMs = parseInt(options.step) * 1000;
    var stepMs = step * 1000;
    var baseTimestamp = start * 1000;
    for (let value of md.values) {
      var dp_value = parseFloat(value[1]);
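The two step-related changes above (clamping the step to at least 1 second, and capping the number of returned points) can be sketched in isolation. A minimal standalone TypeScript version of `adjustInterval` as it reads after this diff, with all values in seconds — a sketch, not the actual class method:

```typescript
// Prometheus rejects query_range requests that would return more than
// ~11000 data points; this widens the step as needed, and the added
// third argument to Math.max means the step can never drop below 1s.
function adjustInterval(interval: number, minInterval: number, range: number, intervalFactor: number): number {
  if (interval !== 0 && range / intervalFactor / interval > 11000) {
    // Widen the step so at most ~11000 points come back.
    interval = Math.ceil(range / intervalFactor / 11000);
  }
  return Math.max(interval * intervalFactor, minInterval, 1);
}
```

With a 2-second range and a 100 ms interval this returns 1, which is exactly the `step=1` the new datasource spec asserts.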
@@ -1,3 +1,16 @@
<datasource-http-settings current="ctrl.current" suggest-url="http://localhost:9090">
</datasource-http-settings>

<div class="gf-form-group">
  <div class="gf-form-inline">
    <div class="gf-form">
      <span class="gf-form-label">Scrape interval</span>
      <input type="text" class="gf-form-input width-6" ng-model="ctrl.current.jsonData.timeInterval" spellcheck='false' placeholder="15s"></input>
      <info-popover mode="right-absolute">
        Set this to your global scrape interval defined in your Prometheus config file. This will be used as a lower limit
        for the Prometheus step query parameter.
      </info-popover>
    </div>
  </div>
</div>
@@ -4,7 +4,8 @@
  "id": "prometheus",

  "includes": [
    {"type": "dashboard", "name": "Prometheus Stats", "path": "dashboards/prometheus_stats.json"}
    {"type": "dashboard", "name": "Prometheus Stats", "path": "dashboards/prometheus_stats.json"},
    {"type": "dashboard", "name": "Grafana Stats", "path": "dashboards/grafana_stats.json"}
  ],

  "metrics": true,
@@ -44,12 +44,18 @@ describe('Prometheus editor completer', function() {
  describe('When inside brackets', () => {
    it('Should return range vectors', () => {
      const session = getSessionStub({
        currentToken: {},
        tokens: [],
        line: '',
        currentToken: {type: 'paren.lparen', value: '[', index: 2, start: 9},
        tokens: [
          {type: 'identifier', value: 'node_cpu'},
          {type: 'paren.lparen', value: '['}
        ],
        line: 'node_cpu[',
      });
      completer.getCompletions(editor, session, {row: 0, column: 10}, '[', (s, res) => {
        expect(res[0]).to.eql({caption: '1s', value: '[1s', meta: 'range vector'});

      return completer.getCompletions(editor, session, {row: 0, column: 10}, '[', (s, res) => {
        expect(res[0].caption).to.eql('1s');
        expect(res[0].value).to.eql('[1s');
        expect(res[0].meta).to.eql('range vector');
      });
    });
  });
@@ -5,7 +5,7 @@ import {PrometheusDatasource} from '../datasource';

describe('PrometheusDatasource', function() {
  var ctx = new helpers.ServiceTestContext();
  var instanceSettings = {url: 'proxied', directUrl: 'direct', user: 'test', password: 'mupp' };
  var instanceSettings = {url: 'proxied', directUrl: 'direct', user: 'test', password: 'mupp', jsonData: {}};

  beforeEach(angularMocks.module('grafana.core'));
  beforeEach(angularMocks.module('grafana.services'));
@@ -294,6 +294,20 @@ describe('PrometheusDatasource', function() {
    ctx.ds.query(query);
    ctx.$httpBackend.verifyNoOutstandingExpectation();
  });

  it('step should never go below 1', function() {
    var query = {
      // 2 second range
      range: { from: moment(1508318768202), to: moment(1508318770118) },
      targets: [{expr: 'test'}],
      interval: '100ms'
    };
    var urlExpected = 'proxied/api/v1/query_range?query=test&start=1508318769&end=1508318771&step=1';
    ctx.$httpBackend.expect('GET', urlExpected).respond(response);
    ctx.ds.query(query);
    ctx.$httpBackend.verifyNoOutstandingExpectation();
  });

  it('should be auto interval when greater than min interval', function() {
    var query = {
      // 6 hour range
@@ -8,7 +8,7 @@ import PrometheusMetricFindQuery from '../metric_find_query';
describe('PrometheusMetricFindQuery', function() {

  var ctx = new helpers.ServiceTestContext();
  var instanceSettings = {url: 'proxied', directUrl: 'direct', user: 'test', password: 'mupp' };
  var instanceSettings = {url: 'proxied', directUrl: 'direct', user: 'test', password: 'mupp', jsonData: {}};

  beforeEach(angularMocks.module('grafana.core'));
  beforeEach(angularMocks.module('grafana.services'));
@@ -34,7 +34,7 @@ class GettingStartedPanelCtrl extends PanelCtrl {
  check: () => {
    return $q.when(
      datasourceSrv.getMetricSources().filter(item => {
        return item.meta.builtIn === false;
        return item.meta.builtIn !== true;
      }).length > 0
    );
  }
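Why the hunk above changes `builtIn === false` to `builtIn !== true`: data sources that never set `meta.builtIn` leave it `undefined`, and `undefined === false` is false, so real data sources were filtered out of the "getting started" check. A TypeScript sketch with illustrative names (not Grafana's actual types):

```typescript
// Minimal stand-in for a Grafana metric source entry.
interface MetricSource { name: string; meta: { builtIn?: boolean } }

// Returns true when at least one non-built-in data source is configured.
function hasConfiguredDataSource(sources: MetricSource[]): boolean {
  // `!== true` keeps entries where builtIn is false OR undefined.
  return sources.filter(item => item.meta.builtIn !== true).length > 0;
}
```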
@@ -29,7 +29,7 @@ define([
  $scope.setOverride = function(item, subItem) {
    // handle color overrides
    if (item.propertyName === 'color') {
      $scope.openColorSelector();
      $scope.openColorSelector($scope.override['color']);
      return;
    }
@@ -52,15 +52,17 @@ define([
    $scope.ctrl.render();
  };

  $scope.openColorSelector = function() {
  $scope.openColorSelector = function(color) {
    var fakeSeries = {color: color};
    popoverSrv.show({
      element: $element.find(".dropdown")[0],
      position: 'top center',
      openOn: 'click',
      template: '<series-color-picker onColorChange="colorSelected" />',
      template: '<series-color-picker series="series" onColorChange="colorSelected" />',
      model: {
        autoClose: true,
        colorSelected: $scope.colorSelected,
        series: fakeSeries
      },
      onClose: function() {
        $scope.ctrl.render();
@@ -152,7 +152,7 @@ function drawLegendValues(elem, colorScale, rangeFrom, rangeTo, maxValue, minVal
    .tickSize(2);

  let colorRect = legendElem.find(":first-child");
  let posY = colorRect.height() + 2;
  let posY = getSvgElemHeight(legendElem) + 2;
  let posX = getSvgElemX(colorRect);

  d3.select(legendElem.get(0)).append("g")
@@ -256,7 +256,16 @@ function getOpacityScale(options, maxValue, minValue = 0) {
function getSvgElemX(elem) {
  let svgElem = elem.get(0);
  if (svgElem && svgElem.x && svgElem.x.baseVal) {
    return elem.get(0).x.baseVal.value;
    return svgElem.x.baseVal.value;
  } else {
    return 0;
  }
}

function getSvgElemHeight(elem) {
  let svgElem = elem.get(0);
  if (svgElem && svgElem.height && svgElem.height.baseVal) {
    return svgElem.height.baseVal.value;
  } else {
    return 0;
  }
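Both helpers above use the same defensive pattern: only trust `baseVal` when the whole property chain exists, otherwise fall back to 0. A sketch of that pattern with plain objects standing in for real SVG DOM nodes (the type is illustrative):

```typescript
// Loose stand-in for an SVG element that may or may not expose height.baseVal.
type SvgLike = { height?: { baseVal?: { value: number } } } | undefined;

// Mirrors the guard in the hunk above: missing element, missing height,
// or missing baseVal all safely yield 0 instead of throwing.
function getSvgElemHeight(svgElem: SvgLike): number {
  if (svgElem && svgElem.height && svgElem.height.baseVal) {
    return svgElem.height.baseVal.value;
  }
  return 0;
}
```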
@@ -37,7 +37,16 @@ function elasticHistogramToHeatmap(seriesList) {
        bucket = heatmap[time] = {x: time, buckets: {}};
      }

      bucket.buckets[bound] = {y: bound, count: count, values: [], points: []};
      bucket.buckets[bound] = {
        y: bound,
        count: count,
        bounds: {
          top: null,
          bottom: bound
        },
        values: [],
        points: []
      };
    }
  }
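The hunk above gives each histogram bucket explicit `bounds`, so later code (such as the tooltip) can match on `bounds.bottom` instead of relying on the object key. A TypeScript sketch of the new bucket shape with a hypothetical constructor helper:

```typescript
// Shape of a heatmap bucket as built in the diff above (names mirror the diff;
// the interface and helper themselves are illustrative).
interface HeatmapBucket {
  y: number;
  count: number;
  bounds: { top: number | null; bottom: number };
  values: number[];
  points: number[][];
}

function makeBucket(bound: number, count: number): HeatmapBucket {
  // Elasticsearch histograms only give a lower bound, so top stays null.
  return { y: bound, count, bounds: { top: null, bottom: bound }, values: [], points: [] };
}
```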
@@ -83,7 +83,7 @@ export class HeatmapTooltip {
    let xData = data.buckets[xBucketIndex];
    // Search in special 'zero' bucket also
    let yData = _.find(xData.buckets, (bucket, bucketIndex) => {
      return bucket.bounds.bottom === yBucketIndex || bucketIndex === yBucketIndex;
      return bucket.bounds.bottom === yBucketIndex || bucketIndex === yBucketIndex.toString();
    });

    let tooltipTimeFormat = 'YYYY-MM-DD HH:mm:ss';
@@ -168,7 +168,8 @@ export class HeatmapTooltip {
    let yBucketSize = this.scope.ctrl.data.yBucketSize;
    let {min, max, ticks} = this.scope.ctrl.data.yAxis;
    let histogramData = _.map(xBucket.buckets, bucket => {
      return [bucket.bounds.bottom, bucket.values.length];
      let count = bucket.count !== undefined ? bucket.count : bucket.values.length;
      return [bucket.bounds.bottom, count];
    });
    histogramData = _.filter(histogramData, d => {
      return d[0] >= min && d[0] <= max;
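The tooltip change above falls back to `values.length` only when `count` is absent, so pre-aggregated (Elasticsearch) buckets keep their counts — including a genuine count of 0, which a simple `count || values.length` would have clobbered. A standalone sketch of the fallback (the function name is illustrative):

```typescript
// Prefer an explicit count; fall back to counting raw values only when
// no count was provided. `!== undefined` keeps a legitimate count of 0.
function bucketCount(bucket: { count?: number; values: number[] }): number {
  return bucket.count !== undefined ? bucket.count : bucket.values.length;
}
```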
@@ -71,9 +71,8 @@ export default function link(scope, elem, attrs, ctrl) {
  function getYAxisWidth(elem) {
    let axis_text = elem.selectAll(".axis-y text").nodes();
    let max_text_width = _.max(_.map(axis_text, text => {
      let el = $(text);
      // Use JQuery outerWidth() to compute full element width
      return el.outerWidth();
      // Use SVG getBBox method
      return text.getBBox().width;
    }));

    return max_text_width;
@@ -224,17 +224,17 @@ describe('ES Histogram converter', () => {
  '1422774000000': {
    x: 1422774000000,
    buckets: {
      '1': { y: 1, count: 1, values: [], points: [] },
      '2': { y: 2, count: 5, values: [], points: [] },
      '3': { y: 3, count: 0, values: [], points: [] }
      '1': { y: 1, count: 1, values: [], points: [], bounds: {bottom: 1, top: null}},
      '2': { y: 2, count: 5, values: [], points: [], bounds: {bottom: 2, top: null}},
      '3': { y: 3, count: 0, values: [], points: [], bounds: {bottom: 3, top: null}}
    }
  },
  '1422774060000': {
    x: 1422774060000,
    buckets: {
      '1': { y: 1, count: 0, values: [], points: [] },
      '2': { y: 2, count: 3, values: [], points: [] },
      '3': { y: 3, count: 1, values: [], points: [] }
      '1': { y: 1, count: 0, values: [], points: [], bounds: {bottom: 1, top: null}},
      '2': { y: 2, count: 3, values: [], points: [], bounds: {bottom: 2, top: null}},
      '3': { y: 3, count: 1, values: [], points: [], bounds: {bottom: 3, top: null}}
    }
  },
};
@@ -66,7 +66,7 @@ class SingleStatCtrl extends MetricsPanelCtrl {
  thresholds: '',
  colorBackground: false,
  colorValue: false,
  colors: ["rgba(245, 54, 54, 0.9)", "rgba(237, 129, 40, 0.89)", "rgba(50, 172, 45, 0.97)"],
  colors: ["#299c46", "rgba(237, 129, 40, 0.89)", "#d44a3a"],
  sparkline: {
    show: false,
    full: false,
@@ -27,8 +27,8 @@ $white: #fff;
// -------------------------
$blue: #33B5E5;
$blue-dark: #005f81;
$green: #609000;
$red: #CC3900;
$green: #299c46;
$red: #d44a3a;
$yellow: #ECBB13;
$pink: #FF4444;
$purple: #9933CC;
@@ -130,20 +130,20 @@ $table-border: $dark-3; // table and cell border
// Buttons
// -------------------------

$btn-primary-bg: $brand-primary;
$btn-primary-bg-hl: lighten($brand-primary, 8%);
$btn-primary-bg: #ff6600;
$btn-primary-bg-hl: #bc3e06;

$btn-secondary-bg: $blue-dark;
$btn-secondary-bg-hl: lighten($blue-dark, 5%);

$btn-success-bg: lighten($green, 3%);
$btn-success-bg-hl: darken($green, 3%);
$btn-success-bg: $green;
$btn-success-bg-hl: darken($green, 6%);

$btn-warning-bg: $brand-warning;
$btn-warning-bg-hl: lighten($brand-warning, 8%);

$btn-danger-bg: $red;
$btn-danger-bg-hl: lighten($red, 5%);
$btn-danger-bg-hl: darken($red, 8%);

$btn-inverse-bg: $dark-3;
$btn-inverse-bg-hl: lighten($dark-3, 4%);
@@ -32,8 +32,8 @@ $white: #fff;
// -------------------------
$blue: #2AB2E4;
$blue-dark: #3CAAD6;
$green: #28B62C;
$red: #FF4136;
$green: #3aa655;
$red: #d44939;
$yellow: #FF851B;
$orange: #Ff7941;
$pink: #E671B8;
@@ -121,6 +121,7 @@ $gf-form-margin: 0.25rem;
  // text areas should be scrollable
  @at-root textarea#{&} {
    overflow: auto;
    white-space: pre-wrap;
  }

  // Unstyle the caret on `<select>`s in IE10+.
@@ -211,15 +211,6 @@
    margin-right: 0px;
    line-height: initial;
  }
  .close {
    margin-right: 5px;
    color: $link-color;
    opacity: 0.7;
    text-shadow: none;
  }
  .editor-row {
    padding: 5px;
  }
}

.annotation-tags {
@@ -61,6 +61,7 @@
  margin: 0 $panel-margin $panel-margin*2 $panel-margin;
  padding: $panel-margin*2;
  display: flex;
  flex-direction: row;
}

.dash-row-dropview-close {
@@ -71,19 +72,10 @@
  height: 20px;
}

.add-panel-panels-scroll {
  width: 100%;
  overflow: auto;
  -ms-overflow-style: none;

  &::-webkit-scrollbar {
    display: none
  }
}

.add-panel-panels {
  display: flex;
  flex-direction: row;
  flex-wrap: wrap;
}

.add-panel-item {
public/vendor/flot/jquery.flot.js (vendored, 7 changed lines)
@@ -2962,8 +2962,11 @@ Licensed under the MIT license.
        }

        function onClick(e) {
            triggerClickHoverEvent("plotclick", e,
                                   function (s) { return s["clickable"] != false; });
            if (plot.isSelecting) {
                return;
            }

            triggerClickHoverEvent("plotclick", e, function (s) { return s["clickable"] != false; });
        }

        // trigger click or hover event (they send the same parameters
public/vendor/flot/jquery.flot.selection.js (vendored, 5 changed lines)
@@ -152,6 +152,10 @@ The plugin also adds the following methods to the plot object:
        plot.getPlaceholder().trigger("plotselecting", [ null ]);
    }

    setTimeout(function() {
        plot.isSelecting = false;
    }, 10);

    return false;
}
@@ -218,6 +222,7 @@ The plugin also adds the following methods to the plot object:

    setSelectionPos(selection.second, pos);
    if (selectionIsSane()) {
        plot.isSelecting = true;
        selection.show = true;
        plot.triggerRedrawOverlay();
    }
@@ -21,7 +21,7 @@ RUN gpg --keyserver hkp://keys.gnupg.net --recv-keys 409B6B1796C275462A170311380
RUN curl --silent --location https://rpm.nodesource.com/setup_6.x | bash - && \
    yum install -y nodejs --nogpgcheck

ENV GOLANG_VERSION 1.9.1
ENV GOLANG_VERSION 1.9.2

RUN wget https://dl.yarnpkg.com/rpm/yarn.repo -O /etc/yum.repos.d/yarn.repo && \
    yum install -y yarn --nogpgcheck && \