Compare commits

...

47 Commits

Author SHA1 Message Date
Marcus Efraimsson
0bbac5cade Merge pull request #13193 from grafana/cp-5.2.4
Cherry picks v5.2.4
2018-09-07 16:35:16 +02:00
Marcus Efraimsson
e385bc141b release 5.2.4 2018-09-07 16:32:49 +02:00
Carl Bergquist
c53d7ad47c cli: avoid rely on response.ContentLength (#13120)
response.ContentLength might be invalid if the http response
is chunked.

fixes #13079
(cherry picked from commit ee1083d9b4)
2018-09-07 16:30:49 +02:00
Leonard Gram
cf4c090fe2 Version updated to 5.2.3. 2018-08-22 10:20:39 +02:00
Torkel Ödegaard
1440e77bea fixed test in cherry picked commit 2018-08-21 14:01:52 +02:00
Torkel Ödegaard
df83bf10a2 sql: added code migration type
(cherry picked from commit 92ed1f04af)
2018-08-21 13:56:08 +02:00
Marcus Efraimsson
aeaf7b23a6 Merge pull request #12712 from grafana/cp-5.2.2
Cherry-picks for 5.2.2
2018-07-25 13:17:28 +02:00
Marcus Efraimsson
0ff4aa80ed remove unnecessary conversions
(cherry picked from commit d2f31a716f)
2018-07-25 12:27:42 +02:00
Marcus Efraimsson
020ecfdf56 release 5.2.2 2018-07-25 12:17:12 +02:00
Torkel Ödegaard
e70a9de45a fix: postgres/mysql engine cache was not being used, fixes #12636
(cherry picked from commit cd60234e7c)
2018-07-25 12:11:19 +02:00
Torkel Ödegaard
a6fc391583 fix: panel embedd scrolbar fix, fixes #12589
(cherry picked from commit 02ecf01bba)
2018-07-25 12:10:41 +02:00
Mitsuhiro Tanda
534ba6d805 (prometheus) prevent error to use $__interval_ms in query (#12533)
* prevent error to use $__interval_ms in query

* add test

* prevent error to use $__interval_ms in query

(cherry picked from commit 18a8290c65)
2018-07-25 12:08:42 +02:00
David Kaltschmidt
2bd1a19169 Fix freezing browser when loading plugin
- broken since 4d2dd2209
- `*` was previously working as a path matcher, but freezes browser when
  used with new cache-busting plugin loader
- changed matcher to be `/*`

(cherry picked from commit 81e62e1051)
2018-07-25 12:07:15 +02:00
David
a996dd25d6 Fix css loading in plugins (#12573)
- allow css loader to be imported again (wasn't prefixed by plugin)
(cherry picked from commit 4d2dd22095)
2018-07-25 12:06:36 +02:00
Marcus Efraimsson
4d108007fe fix links not updating after changing variables
(cherry picked from commit 5e4d6958d6)
2018-07-25 12:02:58 +02:00
David
4df07b4e25 Fix bar width issue in aligned prometheus queries (#12483)
* Fix bar width issue in aligned prometheus queries

This was broken because null values were filled in with unaligned times.

* use aligned times for result transformation
* add tests

An earlier version of this fix aligned the times again in the transformer, but
I think it's safe to only deal with aligned times in the response.

* Fixed prometheus heatmap tranformer test

The interval needs to be 1 to prevent step alignment.

(cherry picked from commit 0d1f7c8782)
2018-07-25 12:01:18 +02:00
Marcus Efraimsson
2040f61c56 Merge pull request #12458 from grafana/cp-5.2.1
Cherry-picks for 5.2.1
2018-06-29 11:17:46 +02:00
Marcus Efraimsson
aa94f7ebfe release 5.2.1 2018-06-29 10:58:00 +02:00
Torkel Ödegaard
855b570878 fix: log close/flush was done too early, before server shutdown log message was called, fixes #12438
(cherry picked from commit 7a7c6f8fab)
2018-06-29 10:56:00 +02:00
Marcus Efraimsson
1713f7f01d Revert "auth proxy: use real ip when validating white listed ip's"
(cherry picked from commit 8af5da7383)
2018-06-29 10:55:27 +02:00
Marcus Efraimsson
2b7d124be8 fix footer css issue
(cherry picked from commit 54420363d3)
2018-06-29 10:54:16 +02:00
Marcus Efraimsson
ad4d71740a Merge pull request #12415 from grafana/cp-5.2.0
Cherry-picks for v5.2.0 stable
2018-06-27 09:38:18 +02:00
Marcus Efraimsson
77312d3a9c release v5.2.0 2018-06-26 18:21:56 +02:00
Marcus Efraimsson
3565fe7105 login: fix layout issues
(cherry picked from commit 9f02927761)
2018-06-26 18:17:45 +02:00
Marcus Efraimsson
df62c6a197 set correct text in drop down when variable is present in url using key/values
(cherry picked from commit 5280084480)
2018-06-26 18:16:59 +02:00
Marcus Efraimsson
f929bd51db enhance error message if phantomjs executable is not found
If it is an ARM build, explain that PhantomJS is not included by default in
ARM builds; otherwise, explain that PhantomJS isn't installed correctly.

(cherry picked from commit f106de0efd)
2018-06-26 18:16:26 +02:00
rozetko
381f3da30e Set $rootScope in DatasourceSrv
(cherry picked from commit 97db9ece98)
2018-06-26 18:15:52 +02:00
Aleksei Magusev
f48ea5eea6 Fix ResponseParser for InfluxDB to return only string values
(cherry picked from commit b7482ae8b7)
2018-06-26 18:15:01 +02:00
Aleksei Magusev
3c2cb7715b Conditionally select a field to return in ResponseParser for InfluxDB
This patch also fixes "value[1] || value[0]" to not ignore zeros.

(cherry picked from commit e104e9b2c2)
2018-06-26 18:14:44 +02:00
Dan Cech
90132770fa handle "dn" ldap attribute more gracefully (#12385)
* handle "dn" ldap attribute more gracefully

* use strings.ToLower

(cherry picked from commit 583df47c2f)
2018-06-26 09:44:12 +02:00
Daniel Lee
30c882c18d Merge pull request #12364 from grafana/cp-5.2.0-beta3
Cherry-picks for v5.2.0-beta3
2018-06-21 11:11:12 +02:00
Marcus Efraimsson
e5836064ce release v5.2.0-beta3 2018-06-21 10:54:24 +02:00
Marcus Efraimsson
f7cb827944 build: fix signing of multiple rpm packages
(cherry picked from commit e617e23927)
2018-06-21 10:54:04 +02:00
Marcus Efraimsson
f76cafa68e Merge pull request #12357 from grafana/cp-5.2.0-beta2
Cherry-picks for v5.2.0-beta2
2018-06-20 15:21:50 +02:00
Marcus Efraimsson
cdae9126ed release v5.2.0-beta2 2018-06-20 14:58:30 +02:00
Marcus Efraimsson
b4c1df11f6 make sure to process panels in collapsed rows when exporting dashboard
(cherry picked from commit a2e08dc4e8)
2018-06-20 14:56:13 +02:00
Alexander Zobnin
7c94d5cd1a graph: fix legend decimals precision calculation
(cherry picked from commit 24f6d34abd)
2018-06-20 14:55:38 +02:00
Marcus Efraimsson
74d6b5fc1c dashboard: fix drop down links
(cherry picked from commit 4ef4a4d3a7)
2018-06-20 14:55:04 +02:00
Marcus Efraimsson
af42e0836a fix regressions after save modal changes of not storing time and variables per default
Fix problem with adhoc variable filters not handled.
Fix problem with saving variables and time per default when saving a
dashboard as/first time.
Fix updating dashboard model after save with saving time/variables
enabled so that next time you save you won't get checkboxes for save
time/variables unless any values changed.
Tests validating correctness if time/variable values have changed.

(cherry picked from commit 41ac8d4cd5)
2018-06-20 14:54:13 +02:00
Martin Molnar
f453fbe8ef feat(ldap): Allow use of DN in user attribute filter (#3132)
(cherry picked from commit be2fa54459)
2018-06-20 14:53:36 +02:00
Marcus Efraimsson
8d635efda0 snapshot: copy correct props when creating a snapshot
(cherry picked from commit a738347957)
2018-06-20 14:52:59 +02:00
Marcus Efraimsson
0f2e879339 set current org when adding/removing user to org
To avoid a situation where a user's current organization is one they are not
a member of, we always try to make sure that each user has a valid current
organization assigned.

(cherry picked from commit 6d48d0a80c)
2018-06-20 14:52:11 +02:00
Marcus Efraimsson
e51dd88260 Merge pull request #12340 from grafana/apikey-permission-fix-cherry-pick2
v5.2.x cherry pick fix
2018-06-19 12:37:26 +02:00
Torkel Ödegaard
984293cc52 fix: fixed permission issue with api key with viewer role in dashboards with default permissions
(cherry picked from commit 24d0b43e62)
2018-06-19 11:14:33 +02:00
Marcus Efraimsson
9a1a9584b7 Merge pull request #12316 from grafana/v52_merge_master
Merge master to v5.2.x release branch
2018-06-18 11:50:48 +02:00
Marcus Efraimsson
8a69ffb007 Merge branch 'master' into v52_merge_master 2018-06-18 09:44:39 +02:00
Leonard Gram
faa5e699d2 Release v5.2.0-beta1. 2018-06-05 10:43:14 +02:00
58 changed files with 1013 additions and 425 deletions

View File

@@ -60,7 +60,8 @@ datasources:
url: localhost:5432
database: grafana
user: grafana
password: password
secureJsonData:
password: password
jsonData:
sslmode: "disable"
@@ -71,3 +72,4 @@ datasources:
authType: credentials
defaultRegion: eu-west-2

View File

@@ -1,4 +1,4 @@
{
"stable": "5.1.3",
"testing": "5.1.3"
"stable": "5.2.0",
"testing": "5.2.0"
}

View File

@@ -4,7 +4,7 @@
"company": "Grafana Labs"
},
"name": "grafana",
"version": "5.2.0-pre1",
"version": "5.2.4",
"repository": {
"type": "git",
"url": "http://github.com/grafana/grafana.git"

View File

@@ -78,7 +78,13 @@ func tryLoginUsingRememberCookie(c *m.ReqContext) bool {
user := userQuery.Result
// validate remember me cookie
if val, _ := c.GetSuperSecureCookie(user.Rands+user.Password, setting.CookieRememberName); val != user.Login {
signingKey := user.Rands + user.Password
if len(signingKey) < 10 {
c.Logger.Error("Invalid user signingKey")
return false
}
if val, _ := c.GetSuperSecureCookie(signingKey, setting.CookieRememberName); val != user.Login {
return false
}
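The hunk above refuses to validate the remember-me cookie when the signing key (`user.Rands + user.Password`) is too short, since users created by LDAP/OAuth sync historically had empty `Rands` and `Password`, leaving the cookie signed with a trivially forgeable key. A minimal sketch of just the guard (`validSigningKey` is a hypothetical helper, not the real function name):

```go
package main

import "fmt"

// minSigningKeyLen mirrors the len(signingKey) < 10 guard in the hunk above:
// a remember-me cookie signed with a near-empty key is trivially forgeable.
const minSigningKeyLen = 10

// validSigningKey is a hypothetical helper; the real code builds the key
// from user.Rands + user.Password before calling GetSuperSecureCookie.
func validSigningKey(rands, password string) bool {
	return len(rands+password) >= minSigningKeyLen
}

func main() {
	fmt.Println(validSigningKey("", ""))           // empty key from an old LDAP/OAuth user
	fmt.Println(validSigningKey("abcde12345", "")) // a 10-char rands value is enough
}
```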

View File

@@ -3,7 +3,9 @@ package api
import (
"fmt"
"net/http"
"runtime"
"strconv"
"strings"
"time"
m "github.com/grafana/grafana/pkg/models"
@@ -55,6 +57,15 @@ func (hs *HTTPServer) RenderToPng(c *m.ReqContext) {
return
}
if err != nil && err == rendering.ErrPhantomJSNotInstalled {
if strings.HasPrefix(runtime.GOARCH, "arm") {
c.Handle(500, "Rendering failed - PhantomJS isn't included in arm build per default", err)
} else {
c.Handle(500, "Rendering failed - PhantomJS isn't installed correctly", err)
}
return
}
if err != nil {
c.Handle(500, "Rendering failed.", err)
return
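The branch added above keys off `runtime.GOARCH` so ARM users get an accurate error instead of a generic "PhantomJS isn't installed" message. A small sketch of the architecture check (helper name is illustrative):

```go
package main

import (
	"fmt"
	"runtime"
	"strings"
)

// isArmArch mirrors the check in the hunk above: both the "arm" and
// "arm64" values of runtime.GOARCH share the "arm" prefix, so a prefix
// match covers 32- and 64-bit ARM builds in one test.
func isArmArch(goarch string) bool {
	return strings.HasPrefix(goarch, "arm")
}

func main() {
	fmt.Println(isArmArch(runtime.GOARCH))
	fmt.Println(isArmArch("arm64"), isArmArch("amd64"))
}
```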

View File

@@ -152,7 +152,7 @@ func downloadFile(pluginName, filePath, url string) (err error) {
return err
}
r, err := zip.NewReader(bytes.NewReader(body), resp.ContentLength)
r, err := zip.NewReader(bytes.NewReader(body), int64(len(body)))
if err != nil {
return err
}
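The one-line fix above sizes the zip reader by the bytes actually read rather than `resp.ContentLength`, which is `-1` (unknown) when the server sends a chunked response (the bug in #13079). A self-contained sketch of the corrected pattern, building a small archive in memory to stand in for the downloaded body:

```go
package main

import (
	"archive/zip"
	"bytes"
	"fmt"
)

// readZip mirrors the fix above: since the whole body has already been
// read into memory, int64(len(body)) is always a valid size, unlike
// resp.ContentLength for chunked responses.
func readZip(body []byte) ([]string, error) {
	r, err := zip.NewReader(bytes.NewReader(body), int64(len(body)))
	if err != nil {
		return nil, err
	}
	var names []string
	for _, f := range r.File {
		names = append(names, f.Name)
	}
	return names, nil
}

// makeZip builds a tiny in-memory archive standing in for a plugin download.
func makeZip() []byte {
	var buf bytes.Buffer
	w := zip.NewWriter(&buf)
	f, _ := w.Create("plugin.json")
	f.Write([]byte(`{}`))
	w.Close()
	return buf.Bytes()
}

func main() {
	names, err := readZip(makeZip())
	fmt.Println(names, err)
}
```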

View File

@@ -14,7 +14,6 @@ import (
"net/http"
_ "net/http/pprof"
"github.com/grafana/grafana/pkg/log"
"github.com/grafana/grafana/pkg/metrics"
"github.com/grafana/grafana/pkg/setting"
@@ -88,9 +87,6 @@ func main() {
err := server.Run()
trace.Stop()
log.Close()
server.Exit(err)
}

View File

@@ -184,6 +184,8 @@ func (g *GrafanaServerImpl) Exit(reason error) {
}
g.log.Error("Server shutdown", "reason", reason)
log.Close()
os.Exit(code)
}

View File

@@ -308,6 +308,7 @@ func (a *ldapAuther) searchForUser(username string) (*LdapUserInfo, error) {
} else {
filter_replace = getLdapAttr(a.server.GroupSearchFilterUserAttribute, searchResult)
}
filter := strings.Replace(a.server.GroupSearchFilter, "%s", ldap.EscapeFilter(filter_replace), -1)
a.log.Info("Searching for user's groups", "filter", filter)
@@ -348,7 +349,7 @@ func (a *ldapAuther) searchForUser(username string) (*LdapUserInfo, error) {
}
func getLdapAttrN(name string, result *ldap.SearchResult, n int) string {
if name == "DN" {
if strings.ToLower(name) == "dn" {
return result.Entries[n].DN
}
for _, attr := range result.Entries[n].Attributes {
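The `getLdapAttrN` change above makes the special-cased `dn` attribute case-insensitive, so configs writing `dn`, `DN`, or `Dn` all resolve to the entry's distinguished name. A pared-down sketch (the `Entry` type is a stand-in for `ldap.Entry`):

```go
package main

import (
	"fmt"
	"strings"
)

// Entry is a simplified stand-in for ldap.Entry in the hunk above.
type Entry struct {
	DN    string
	Attrs map[string]string
}

// getAttr returns the DN for any capitalisation of "dn", mirroring the
// strings.ToLower(name) == "dn" fix; other names fall through to the
// regular attribute lookup.
func getAttr(name string, e Entry) string {
	if strings.ToLower(name) == "dn" {
		return e.DN
	}
	return e.Attrs[name]
}

func main() {
	e := Entry{DN: "cn=admin,dc=example,dc=org", Attrs: map[string]string{"mail": "a@b"}}
	fmt.Println(getAttr("dn", e))
	fmt.Println(getAttr("DN", e))
	fmt.Println(getAttr("mail", e))
}
```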

View File

@@ -2,6 +2,7 @@ package middleware
import (
"fmt"
"net"
"net/mail"
"reflect"
"strings"
@@ -28,7 +29,7 @@ func initContextWithAuthProxy(ctx *m.ReqContext, orgID int64) bool {
}
// if auth proxy ip(s) defined, check if request comes from one of those
if err := checkAuthenticationProxy(ctx.RemoteAddr(), proxyHeaderValue); err != nil {
if err := checkAuthenticationProxy(ctx.Req.RemoteAddr, proxyHeaderValue); err != nil {
ctx.Handle(407, "Proxy authentication required", err)
return true
}
@@ -196,23 +197,18 @@ func checkAuthenticationProxy(remoteAddr string, proxyHeaderValue string) error
return nil
}
// Multiple ip addresses? Right-most IP address is the IP address of the most recent proxy
if strings.Contains(remoteAddr, ",") {
sourceIPs := strings.Split(remoteAddr, ",")
remoteAddr = strings.TrimSpace(sourceIPs[len(sourceIPs)-1])
}
remoteAddr = strings.TrimPrefix(remoteAddr, "[")
remoteAddr = strings.TrimSuffix(remoteAddr, "]")
proxies := strings.Split(setting.AuthProxyWhitelist, ",")
sourceIP, _, err := net.SplitHostPort(remoteAddr)
if err != nil {
return err
}
// Compare allowed IP addresses to actual address
for _, proxyIP := range proxies {
if remoteAddr == strings.TrimSpace(proxyIP) {
if sourceIP == strings.TrimSpace(proxyIP) {
return nil
}
}
return fmt.Errorf("Request for user (%s) from %s is not from the authentication proxy", proxyHeaderValue, remoteAddr)
return fmt.Errorf("Request for user (%s) from %s is not from the authentication proxy", proxyHeaderValue, sourceIP)
}
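The rewritten check above replaces manual port/bracket trimming with `net.SplitHostPort`, which handles IPv6 literals like `[2001::23]:443` correctly before comparing against the whitelist. A minimal sketch of the resulting logic under those assumptions:

```go
package main

import (
	"fmt"
	"net"
	"strings"
)

// checkAuthProxy mirrors the rewritten check: net.SplitHostPort extracts
// the source IP (IPv6 included) from host:port form, and only then is it
// compared against the comma-separated whitelist.
func checkAuthProxy(remoteAddr, whitelist string) error {
	if len(whitelist) == 0 {
		return nil
	}
	sourceIP, _, err := net.SplitHostPort(remoteAddr)
	if err != nil {
		return err
	}
	for _, proxyIP := range strings.Split(whitelist, ",") {
		if sourceIP == strings.TrimSpace(proxyIP) {
			return nil
		}
	}
	return fmt.Errorf("request from %s is not from the authentication proxy", sourceIP)
}

func main() {
	fmt.Println(checkAuthProxy("192.168.1.1:51234", "192.168.1.1, 2001::23"))
	fmt.Println(checkAuthProxy("[2001::23]:443", "192.168.1.1, 2001::23"))
	fmt.Println(checkAuthProxy("10.0.0.9:80", "192.168.1.1"))
}
```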

View File

@@ -293,61 +293,6 @@ func TestMiddlewareContext(t *testing.T) {
})
})
middlewareScenario("When auth_proxy is enabled and request has X-Forwarded-For that is not trusted", func(sc *scenarioContext) {
setting.AuthProxyEnabled = true
setting.AuthProxyHeaderName = "X-WEBAUTH-USER"
setting.AuthProxyHeaderProperty = "username"
setting.AuthProxyWhitelist = "192.168.1.1, 2001::23"
bus.AddHandler("test", func(query *m.GetSignedInUserQuery) error {
query.Result = &m.SignedInUser{OrgId: 4, UserId: 33}
return nil
})
bus.AddHandler("test", func(cmd *m.UpsertUserCommand) error {
cmd.Result = &m.User{Id: 33}
return nil
})
sc.fakeReq("GET", "/")
sc.req.Header.Add("X-WEBAUTH-USER", "torkelo")
sc.req.Header.Add("X-Forwarded-For", "client-ip, 192.168.1.1, 192.168.1.2")
sc.exec()
Convey("should return 407 status code", func() {
So(sc.resp.Code, ShouldEqual, 407)
So(sc.resp.Body.String(), ShouldContainSubstring, "Request for user (torkelo) from 192.168.1.2 is not from the authentication proxy")
})
})
middlewareScenario("When auth_proxy is enabled and request has X-Forwarded-For that is trusted", func(sc *scenarioContext) {
setting.AuthProxyEnabled = true
setting.AuthProxyHeaderName = "X-WEBAUTH-USER"
setting.AuthProxyHeaderProperty = "username"
setting.AuthProxyWhitelist = "192.168.1.1, 2001::23"
bus.AddHandler("test", func(query *m.GetSignedInUserQuery) error {
query.Result = &m.SignedInUser{OrgId: 4, UserId: 33}
return nil
})
bus.AddHandler("test", func(cmd *m.UpsertUserCommand) error {
cmd.Result = &m.User{Id: 33}
return nil
})
sc.fakeReq("GET", "/")
sc.req.Header.Add("X-WEBAUTH-USER", "torkelo")
sc.req.Header.Add("X-Forwarded-For", "client-ip, 192.168.1.2, 192.168.1.1")
sc.exec()
Convey("Should init context with user info", func() {
So(sc.context.IsSignedIn, ShouldBeTrue)
So(sc.context.UserId, ShouldEqual, 33)
So(sc.context.OrgId, ShouldEqual, 4)
})
})
middlewareScenario("When session exists for previous user, create a new session", func(sc *scenarioContext) {
setting.AuthProxyEnabled = true
setting.AuthProxyHeaderName = "X-WEBAUTH-USER"

View File

@@ -50,7 +50,7 @@ func TestAlertRuleExtraction(t *testing.T) {
So(err, ShouldBeNil)
Convey("Extractor should not modify the original json", func() {
dashJson, err := simplejson.NewJson([]byte(json))
dashJson, err := simplejson.NewJson(json)
So(err, ShouldBeNil)
dash := m.NewDashboardFromJson(dashJson)
@@ -79,7 +79,7 @@ func TestAlertRuleExtraction(t *testing.T) {
Convey("Parsing and validating dashboard containing graphite alerts", func() {
dashJson, err := simplejson.NewJson([]byte(json))
dashJson, err := simplejson.NewJson(json)
So(err, ShouldBeNil)
dash := m.NewDashboardFromJson(dashJson)
@@ -143,7 +143,7 @@ func TestAlertRuleExtraction(t *testing.T) {
panelWithoutId, err := ioutil.ReadFile("./test-data/panels-missing-id.json")
So(err, ShouldBeNil)
dashJson, err := simplejson.NewJson([]byte(panelWithoutId))
dashJson, err := simplejson.NewJson(panelWithoutId)
So(err, ShouldBeNil)
dash := m.NewDashboardFromJson(dashJson)
extractor := NewDashAlertExtractor(dash, 1)
@@ -159,7 +159,7 @@ func TestAlertRuleExtraction(t *testing.T) {
panelWithIdZero, err := ioutil.ReadFile("./test-data/panel-with-id-0.json")
So(err, ShouldBeNil)
dashJson, err := simplejson.NewJson([]byte(panelWithIdZero))
dashJson, err := simplejson.NewJson(panelWithIdZero)
So(err, ShouldBeNil)
dash := m.NewDashboardFromJson(dashJson)
extractor := NewDashAlertExtractor(dash, 1)

View File

@@ -83,7 +83,7 @@ func (g *dashboardGuardianImpl) checkAcl(permission m.PermissionType, acl []*m.D
for _, p := range acl {
// user match
if !g.user.IsAnonymous {
if !g.user.IsAnonymous && p.UserId > 0 {
if p.UserId == g.user.UserId && p.Permission >= permission {
return true, nil
}
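The `p.UserId > 0` guard above fixes the API-key permission bug from #12340: an API-key pseudo-user carries `UserId` 0, the same value that non-user (role/team) ACL entries use, so the plain equality check spuriously granted user-level permissions. A reduced sketch of the corrected match (types simplified from the real models):

```go
package main

import "fmt"

type aclEntry struct {
	UserId     int64
	Permission int
}

type signedInUser struct {
	UserId      int64
	IsAnonymous bool
}

// userMatches mirrors the fix above: requiring p.UserId > 0 restricts the
// user branch to ACL entries that actually target a user, so an API-key
// request (UserId 0) can no longer match a role/team entry (also UserId 0).
func userMatches(u signedInUser, p aclEntry, required int) bool {
	if !u.IsAnonymous && p.UserId > 0 {
		return p.UserId == u.UserId && p.Permission >= required
	}
	return false
}

func main() {
	apiKeyUser := signedInUser{UserId: 0}
	roleEntry := aclEntry{UserId: 0, Permission: 2}
	fmt.Println(userMatches(apiKeyUser, roleEntry, 2)) // no longer matches
	fmt.Println(userMatches(signedInUser{UserId: 7}, aclEntry{UserId: 7, Permission: 2}, 2))
}
```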

View File

@@ -162,6 +162,11 @@ func TestGuardianViewer(t *testing.T) {
sc.parentFolderPermissionScenario(VIEWER, m.PERMISSION_EDIT, EDITOR_ACCESS)
sc.parentFolderPermissionScenario(VIEWER, m.PERMISSION_VIEW, VIEWER_ACCESS)
})
apiKeyScenario("Given api key with viewer role", t, m.ROLE_VIEWER, func(sc *scenarioContext) {
// dashboard has default permissions
sc.defaultPermissionScenario(VIEWER, m.PERMISSION_EDIT, VIEWER_ACCESS)
})
})
}
@@ -267,7 +272,7 @@ func (sc *scenarioContext) verifyExpectedPermissionsFlags() {
actualFlag = NO_ACCESS
}
if sc.expectedFlags&actualFlag != sc.expectedFlags {
if actualFlag&sc.expectedFlags != actualFlag {
sc.reportFailure(tc, sc.expectedFlags.String(), actualFlag.String())
}
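The assertion change above flips the direction of a bitmask subset test: the old form checked that every expected flag was present in the actual flags, while the new form (`actualFlag & expectedFlags != actualFlag`) also fails when the actual flags carry extra, unexpected bits. A sketch of the distinction with illustrative flag names:

```go
package main

import "fmt"

type flags uint8

// Illustrative permission bits, not the real constants.
const (
	CANVIEW flags = 1 << iota
	CANEDIT
	CANSAVE
	CANADMIN
)

// isSubset reports whether every bit set in got is also set in want,
// matching the corrected assertion (got&want == got). The earlier form
// (want&got == want) would still pass when got contained extra bits.
func isSubset(got, want flags) bool {
	return got&want == got
}

func main() {
	fmt.Println(isSubset(CANVIEW, CANVIEW|CANEDIT))          // within expectations
	fmt.Println(isSubset(CANVIEW|CANADMIN, CANVIEW|CANEDIT)) // extra admin bit
}
```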

View File

@@ -48,6 +48,27 @@ func orgRoleScenario(desc string, t *testing.T, role m.RoleType, fn scenarioFunc
})
}
func apiKeyScenario(desc string, t *testing.T, role m.RoleType, fn scenarioFunc) {
user := &m.SignedInUser{
UserId: 0,
OrgId: orgID,
OrgRole: role,
ApiKeyId: 10,
}
guard := New(dashboardID, orgID, user)
sc := &scenarioContext{
t: t,
orgRoleScenario: desc,
givenUser: user,
givenDashboardID: dashboardID,
g: guard,
}
Convey(desc, func() {
fn(sc)
})
}
func permissionScenario(desc string, dashboardID int64, sc *scenarioContext, permissions []*m.DashboardAclInfoDTO, fn scenarioFunc) {
bus.ClearBusHandlers()

View File

@@ -10,6 +10,7 @@ import (
var ErrTimeout = errors.New("Timeout error. You can set timeout in seconds with &timeout url parameter")
var ErrNoRenderer = errors.New("No renderer plugin found nor is an external render server configured")
var ErrPhantomJSNotInstalled = errors.New("PhantomJS executable not found")
type Opts struct {
Width int

View File

@@ -24,6 +24,11 @@ func (rs *RenderingService) renderViaPhantomJS(ctx context.Context, opts Opts) (
url := rs.getURL(opts.Path)
binPath, _ := filepath.Abs(filepath.Join(rs.Cfg.PhantomDir, executable))
if _, err := os.Stat(binPath); os.IsNotExist(err) {
rs.log.Error("executable not found", "executable", binPath)
return nil, ErrPhantomJSNotInstalled
}
scriptPath, _ := filepath.Abs(filepath.Join(rs.Cfg.PhantomDir, "render.js"))
pngPath := rs.getFilePathForNewImage()

View File

@@ -1,6 +1,12 @@
package migrations
import . "github.com/grafana/grafana/pkg/services/sqlstore/migrator"
import (
"fmt"
"github.com/go-xorm/xorm"
. "github.com/grafana/grafana/pkg/services/sqlstore/migrator"
"github.com/grafana/grafana/pkg/util"
)
func addUserMigrations(mg *Migrator) {
userV1 := Table{
@@ -107,4 +113,37 @@ func addUserMigrations(mg *Migrator) {
mg.AddMigration("Add last_seen_at column to user", NewAddColumnMigration(userV2, &Column{
Name: "last_seen_at", Type: DB_DateTime, Nullable: true,
}))
// Adds salt & rands for old users who used ldap or oauth
mg.AddMigration("Add missing user data", &AddMissingUserSaltAndRandsMigration{})
}
type AddMissingUserSaltAndRandsMigration struct {
MigrationBase
}
func (m *AddMissingUserSaltAndRandsMigration) Sql(dialect Dialect) string {
return "code migration"
}
type TempUserDTO struct {
Id int64
Login string
}
func (m *AddMissingUserSaltAndRandsMigration) Exec(sess *xorm.Session, mg *Migrator) error {
users := make([]*TempUserDTO, 0)
err := sess.Sql(fmt.Sprintf("SELECT id, login from %s WHERE rands = ''", mg.Dialect.Quote("user"))).Find(&users)
if err != nil {
return err
}
for _, user := range users {
_, err := sess.Exec("UPDATE "+mg.Dialect.Quote("user")+" SET salt = ?, rands = ? WHERE id = ?", util.GetRandomString(10), util.GetRandomString(10), user.Id)
if err != nil {
return err
}
}
return nil
}

View File

@@ -12,7 +12,7 @@ import (
type Migrator struct {
x *xorm.Engine
dialect Dialect
Dialect Dialect
migrations []Migration
Logger log.Logger
}
@@ -31,7 +31,7 @@ func NewMigrator(engine *xorm.Engine) *Migrator {
mg.x = engine
mg.Logger = log.New("migrator")
mg.migrations = make([]Migration, 0)
mg.dialect = NewDialect(mg.x)
mg.Dialect = NewDialect(mg.x)
return mg
}
@@ -86,7 +86,7 @@ func (mg *Migrator) Start() error {
continue
}
sql := m.Sql(mg.dialect)
sql := m.Sql(mg.Dialect)
record := MigrationLog{
MigrationId: m.Id(),
@@ -122,7 +122,7 @@ func (mg *Migrator) exec(m Migration, sess *xorm.Session) error {
condition := m.GetCondition()
if condition != nil {
sql, args := condition.Sql(mg.dialect)
sql, args := condition.Sql(mg.Dialect)
results, err := sess.SQL(sql).Query(args...)
if err != nil || len(results) == 0 {
mg.Logger.Debug("Skipping migration condition not fulfilled", "id", m.Id())
@@ -130,7 +130,13 @@ func (mg *Migrator) exec(m Migration, sess *xorm.Session) error {
}
}
_, err := sess.Exec(m.Sql(mg.dialect))
var err error
if codeMigration, ok := m.(CodeMigration); ok {
err = codeMigration.Exec(sess, mg)
} else {
_, err = sess.Exec(m.Sql(mg.Dialect))
}
if err != nil {
mg.Logger.Error("Executing migration failed", "id", m.Id(), "error", err)
return err

View File

@@ -3,6 +3,8 @@ package migrator
import (
"fmt"
"strings"
"github.com/go-xorm/xorm"
)
const (
@@ -19,6 +21,11 @@ type Migration interface {
GetCondition() MigrationCondition
}
type CodeMigration interface {
Migration
Exec(sess *xorm.Session, migrator *Migrator) error
}
type SQLType string
type ColumnType string
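The two hunks above introduce the `CodeMigration` interface and the dispatch in `Migrator.exec`: a Go type assertion decides at runtime whether a migration runs as code (`Exec`) or as plain SQL. A pared-down, xorm-free sketch of that dispatch (`Session` stands in for `*xorm.Session`; names are simplified):

```go
package main

import "fmt"

// Session is a stand-in for *xorm.Session; it just records what ran.
type Session struct{ executed []string }

type Migration interface {
	Sql() string
}

// CodeMigration extends Migration with an Exec hook, as in the hunk above.
type CodeMigration interface {
	Migration
	Exec(sess *Session) error
}

type sqlMigration struct{ stmt string }

func (m sqlMigration) Sql() string { return m.stmt }

type saltMigration struct{}

func (saltMigration) Sql() string { return "code migration" }
func (saltMigration) Exec(sess *Session) error {
	// the real version backfills salt/rands for users that have none
	sess.executed = append(sess.executed, "backfill user salt+rands")
	return nil
}

// run mirrors the migrator's exec: type-assert to CodeMigration and call
// Exec; otherwise execute the migration's SQL.
func run(sess *Session, m Migration) error {
	if cm, ok := m.(CodeMigration); ok {
		return cm.Exec(sess)
	}
	sess.executed = append(sess.executed, m.Sql())
	return nil
}

func main() {
	sess := &Session{}
	run(sess, sqlMigration{"ALTER TABLE user ADD last_seen_at DATETIME"})
	run(sess, saltMigration{})
	fmt.Println(sess.executed)
}
```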

View File

@@ -150,7 +150,7 @@ func TestAccountDataAccess(t *testing.T) {
})
Convey("Can set using org", func() {
cmd := m.SetUsingOrgCommand{UserId: ac2.Id, OrgId: ac1.Id}
cmd := m.SetUsingOrgCommand{UserId: ac2.Id, OrgId: ac1.OrgId}
err := SetUsingOrg(&cmd)
So(err, ShouldBeNil)
@@ -159,13 +159,25 @@ func TestAccountDataAccess(t *testing.T) {
err := GetSignedInUser(&query)
So(err, ShouldBeNil)
So(query.Result.OrgId, ShouldEqual, ac1.Id)
So(query.Result.OrgId, ShouldEqual, ac1.OrgId)
So(query.Result.Email, ShouldEqual, "ac2@test.com")
So(query.Result.Name, ShouldEqual, "ac2 name")
So(query.Result.Login, ShouldEqual, "ac2")
So(query.Result.OrgName, ShouldEqual, "ac1@test.com")
So(query.Result.OrgRole, ShouldEqual, "Viewer")
})
Convey("Should set last org as current when removing user from current", func() {
remCmd := m.RemoveOrgUserCommand{OrgId: ac1.OrgId, UserId: ac2.Id}
err := RemoveOrgUser(&remCmd)
So(err, ShouldBeNil)
query := m.GetSignedInUserQuery{UserId: ac2.Id}
err = GetSignedInUser(&query)
So(err, ShouldBeNil)
So(query.Result.OrgId, ShouldEqual, ac2.OrgId)
})
})
Convey("Cannot delete last admin org user", func() {

View File

@@ -20,7 +20,14 @@ func init() {
func AddOrgUser(cmd *m.AddOrgUserCommand) error {
return inTransaction(func(sess *DBSession) error {
// check if user exists
if res, err := sess.Query("SELECT 1 from org_user WHERE org_id=? and user_id=?", cmd.OrgId, cmd.UserId); err != nil {
var user m.User
if exists, err := sess.Id(cmd.UserId).Get(&user); err != nil {
return err
} else if !exists {
return m.ErrUserNotFound
}
if res, err := sess.Query("SELECT 1 from org_user WHERE org_id=? and user_id=?", cmd.OrgId, user.Id); err != nil {
return err
} else if len(res) == 1 {
return m.ErrOrgUserAlreadyAdded
@@ -41,7 +48,26 @@ func AddOrgUser(cmd *m.AddOrgUserCommand) error {
}
_, err := sess.Insert(&entity)
return err
if err != nil {
return err
}
var userOrgs []*m.UserOrgDTO
sess.Table("org_user")
sess.Join("INNER", "org", "org_user.org_id=org.id")
sess.Where("org_user.user_id=? AND org_user.org_id=?", user.Id, user.OrgId)
sess.Cols("org.name", "org_user.role", "org_user.org_id")
err = sess.Find(&userOrgs)
if err != nil {
return err
}
if len(userOrgs) == 0 {
return setUsingOrgInTransaction(sess, user.Id, cmd.OrgId)
}
return nil
})
}
@@ -110,6 +136,14 @@ func GetOrgUsers(query *m.GetOrgUsersQuery) error {
func RemoveOrgUser(cmd *m.RemoveOrgUserCommand) error {
return inTransaction(func(sess *DBSession) error {
// check if user exists
var user m.User
if exists, err := sess.Id(cmd.UserId).Get(&user); err != nil {
return err
} else if !exists {
return m.ErrUserNotFound
}
deletes := []string{
"DELETE FROM org_user WHERE org_id=? and user_id=?",
"DELETE FROM dashboard_acl WHERE org_id=? and user_id = ?",
@@ -123,6 +157,32 @@ func RemoveOrgUser(cmd *m.RemoveOrgUserCommand) error {
}
}
var userOrgs []*m.UserOrgDTO
sess.Table("org_user")
sess.Join("INNER", "org", "org_user.org_id=org.id")
sess.Where("org_user.user_id=?", user.Id)
sess.Cols("org.name", "org_user.role", "org_user.org_id")
err := sess.Find(&userOrgs)
if err != nil {
return err
}
hasCurrentOrgSet := false
for _, userOrg := range userOrgs {
if user.OrgId == userOrg.OrgId {
hasCurrentOrgSet = true
break
}
}
if !hasCurrentOrgSet && len(userOrgs) > 0 {
err = setUsingOrgInTransaction(sess, user.Id, userOrgs[0].OrgId)
if err != nil {
return err
}
}
return validateOneAdminLeftInOrg(cmd.OrgId, sess)
})
}
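The block added to `RemoveOrgUser` above walks the user's remaining memberships and, if the current org is no longer among them, falls back to the first remaining one. The selection logic, extracted as a small sketch (types and the helper name are simplified):

```go
package main

import "fmt"

type userOrg struct{ OrgId int64 }

// pickCurrentOrg mirrors the logic above: keep the user's current org if
// they are still a member of it, otherwise fall back to the first remaining
// membership. The second return value is false when no membership is left.
func pickCurrentOrg(current int64, orgs []userOrg) (int64, bool) {
	for _, o := range orgs {
		if o.OrgId == current {
			return current, true // still a member; keep as-is
		}
	}
	if len(orgs) > 0 {
		return orgs[0].OrgId, true
	}
	return 0, false // no memberships left to assign
}

func main() {
	fmt.Println(pickCurrentOrg(1, []userOrg{{2}, {3}})) // removed from org 1
	fmt.Println(pickCurrentOrg(2, []userOrg{{2}, {3}})) // still a member of org 2
	fmt.Println(pickCurrentOrg(1, nil))
}
```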

View File

@@ -104,9 +104,10 @@ func CreateUser(cmd *m.CreateUserCommand) error {
LastSeenAt: time.Now().AddDate(-10, 0, 0),
}
user.Salt = util.GetRandomString(10)
user.Rands = util.GetRandomString(10)
if len(cmd.Password) > 0 {
user.Salt = util.GetRandomString(10)
user.Rands = util.GetRandomString(10)
user.Password = util.EncodePassword(cmd.Password, user.Salt)
}
@@ -290,16 +291,20 @@ func SetUsingOrg(cmd *m.SetUsingOrgCommand) error {
}
return inTransaction(func(sess *DBSession) error {
user := m.User{
Id: cmd.UserId,
OrgId: cmd.OrgId,
}
_, err := sess.Id(cmd.UserId).Update(&user)
return err
return setUsingOrgInTransaction(sess, cmd.UserId, cmd.OrgId)
})
}
func setUsingOrgInTransaction(sess *DBSession, userID int64, orgID int64) error {
user := m.User{
Id: userID,
OrgId: orgID,
}
_, err := sess.Id(userID).Update(&user)
return err
}
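The `CreateUser` change above moves the `Salt`/`Rands` assignment out of the password branch, so LDAP/OAuth users (who have no password) also get 10-character random values, which the cookie signing key depends on. A sketch of what a helper like `util.GetRandomString(10)` provides, here built on `crypto/rand` (the implementation details are illustrative, not Grafana's actual helper):

```go
package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

// randomString sketches the role of util.GetRandomString(10) above: a
// 10-character alphanumeric value used as a per-user salt and rands,
// now set for every user instead of only password-based ones.
func randomString(n int) (string, error) {
	const alphanum = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
	b := make([]byte, n)
	for i := range b {
		// crypto/rand avoids predictable values for security-relevant salts
		idx, err := rand.Int(rand.Reader, big.NewInt(int64(len(alphanum))))
		if err != nil {
			return "", err
		}
		b[i] = alphanum[idx.Int64()]
	}
	return string(b), nil
}

func main() {
	salt, _ := randomString(10)
	rands, _ := randomString(10)
	fmt.Println(len(salt), len(rands))
}
```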
func GetUserProfile(query *m.GetUserProfileQuery) error {
var user m.User
has, err := x.Id(query.UserId).Get(&user)

View File

@@ -14,6 +14,28 @@ func TestUserDataAccess(t *testing.T) {
Convey("Testing DB", t, func() {
InitTestDB(t)
Convey("Creating a user", func() {
cmd := &m.CreateUserCommand{
Email: "usertest@test.com",
Name: "user name",
Login: "user_test_login",
}
err := CreateUser(cmd)
So(err, ShouldBeNil)
Convey("Loading a user", func() {
query := m.GetUserByIdQuery{Id: cmd.Result.Id}
err := GetUserById(&query)
So(err, ShouldBeNil)
So(query.Result.Email, ShouldEqual, "usertest@test.com")
So(query.Result.Password, ShouldEqual, "")
So(query.Result.Rands, ShouldHaveLength, 10)
So(query.Result.Salt, ShouldHaveLength, 10)
})
})
Convey("Given 5 users", func() {
var err error
var cmd *m.CreateUserCommand
@@ -96,33 +118,33 @@ func TestUserDataAccess(t *testing.T) {
})
Convey("when a user is an org member and has been assigned permissions", func() {
err = AddOrgUser(&m.AddOrgUserCommand{LoginOrEmail: users[0].Login, Role: m.ROLE_VIEWER, OrgId: users[0].OrgId})
err = AddOrgUser(&m.AddOrgUserCommand{LoginOrEmail: users[1].Login, Role: m.ROLE_VIEWER, OrgId: users[0].OrgId, UserId: users[1].Id})
So(err, ShouldBeNil)
testHelperUpdateDashboardAcl(1, m.DashboardAcl{DashboardId: 1, OrgId: users[0].OrgId, UserId: users[0].Id, Permission: m.PERMISSION_EDIT})
testHelperUpdateDashboardAcl(1, m.DashboardAcl{DashboardId: 1, OrgId: users[0].OrgId, UserId: users[1].Id, Permission: m.PERMISSION_EDIT})
So(err, ShouldBeNil)
err = SavePreferences(&m.SavePreferencesCommand{UserId: users[0].Id, OrgId: users[0].OrgId, HomeDashboardId: 1, Theme: "dark"})
err = SavePreferences(&m.SavePreferencesCommand{UserId: users[1].Id, OrgId: users[0].OrgId, HomeDashboardId: 1, Theme: "dark"})
So(err, ShouldBeNil)
Convey("when the user is deleted", func() {
err = DeleteUser(&m.DeleteUserCommand{UserId: users[0].Id})
err = DeleteUser(&m.DeleteUserCommand{UserId: users[1].Id})
So(err, ShouldBeNil)
Convey("Should delete connected org users and permissions", func() {
query := &m.GetOrgUsersQuery{OrgId: 1}
query := &m.GetOrgUsersQuery{OrgId: users[0].OrgId}
err = GetOrgUsersForTest(query)
So(err, ShouldBeNil)
So(len(query.Result), ShouldEqual, 1)
permQuery := &m.GetDashboardAclInfoListQuery{DashboardId: 1, OrgId: 1}
permQuery := &m.GetDashboardAclInfoListQuery{DashboardId: 1, OrgId: users[0].OrgId}
err = GetDashboardAclInfoList(permQuery)
So(err, ShouldBeNil)
So(len(permQuery.Result), ShouldEqual, 0)
prefsQuery := &m.GetPreferencesQuery{OrgId: users[0].OrgId, UserId: users[0].Id}
prefsQuery := &m.GetPreferencesQuery{OrgId: users[0].OrgId, UserId: users[1].Id}
err = GetPreferences(prefsQuery)
So(err, ShouldBeNil)

View File

@@ -32,7 +32,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
So(json.Get("size").MustInt(500), ShouldEqual, 0)
So(json.Get("sort").Interface(), ShouldBeNil)
@@ -81,7 +81,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
So(json.Get("size").MustInt(0), ShouldEqual, 200)
@@ -124,7 +124,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
scriptFields, err := json.Get("script_fields").Map()
@@ -163,7 +163,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
So(json.Get("aggs").MustMap(), ShouldHaveLength, 2)
@@ -200,7 +200,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
So(json.Get("aggs").MustMap(), ShouldHaveLength, 1)
@@ -251,7 +251,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
topAggOne := json.GetPath("aggs", "1")
@@ -300,7 +300,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
topAgg := json.GetPath("aggs", "1")
@@ -364,7 +364,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
termsAgg := json.GetPath("aggs", "1")
@@ -419,7 +419,7 @@ func TestSearchRequest(t *testing.T) {
Convey("When marshal to JSON should generate correct json", func() {
body, err := json.Marshal(sr)
So(err, ShouldBeNil)
json, err := simplejson.NewJson([]byte(body))
json, err := simplejson.NewJson(body)
So(err, ShouldBeNil)
scriptFields, err := json.Get("script_fields").Map()

View File

@@ -525,7 +525,7 @@ func TestMSSQL(t *testing.T) {
So(queryResult.Error, ShouldBeNil)
So(len(queryResult.Series), ShouldEqual, 1)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float64(float32(tInitial.Unix())))*1e3)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float32(tInitial.Unix()))*1e3)
})
Convey("When doing a metric query using epoch (float32 nullable) as time column and value column (float32 nullable) should return metric with time in milliseconds", func() {
@@ -547,7 +547,7 @@ func TestMSSQL(t *testing.T) {
So(queryResult.Error, ShouldBeNil)
So(len(queryResult.Series), ShouldEqual, 1)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float64(float32(tInitial.Unix())))*1e3)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float32(tInitial.Unix()))*1e3)
})
Convey("When doing a metric query grouping by time and select metric column should return correct series", func() {
@@ -924,7 +924,7 @@ func TestMSSQL(t *testing.T) {
columns := queryResult.Tables[0].Rows[0]
//Should be in milliseconds
So(columns[0].(int64), ShouldEqual, int64(dt.Unix()*1000))
So(columns[0].(int64), ShouldEqual, dt.Unix()*1000)
})
Convey("When doing an annotation query with a time column in epoch second format (int) should return ms", func() {
@@ -954,7 +954,7 @@ func TestMSSQL(t *testing.T) {
columns := queryResult.Tables[0].Rows[0]
//Should be in milliseconds
So(columns[0].(int64), ShouldEqual, int64(dt.Unix()*1000))
So(columns[0].(int64), ShouldEqual, dt.Unix()*1000)
})
Convey("When doing an annotation query with a time column in epoch millisecond format should return ms", func() {

View File

@@ -132,8 +132,8 @@ func TestMySQL(t *testing.T) {
So(column[7].(float64), ShouldEqual, 1.11)
So(column[8].(float64), ShouldEqual, 2.22)
So(*column[9].(*float32), ShouldEqual, 3.33)
So(column[10].(time.Time), ShouldHappenWithin, time.Duration(10*time.Second), time.Now())
So(column[11].(time.Time), ShouldHappenWithin, time.Duration(10*time.Second), time.Now())
So(column[10].(time.Time), ShouldHappenWithin, 10*time.Second, time.Now())
So(column[11].(time.Time), ShouldHappenWithin, 10*time.Second, time.Now())
So(column[12].(string), ShouldEqual, "11:11:11")
So(column[13].(int64), ShouldEqual, 2018)
So(*column[14].(*[]byte), ShouldHaveSameTypeAs, []byte{1})
@@ -571,7 +571,7 @@ func TestMySQL(t *testing.T) {
So(queryResult.Error, ShouldBeNil)
So(len(queryResult.Series), ShouldEqual, 1)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float64(float32(tInitial.Unix())))*1e3)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float32(tInitial.Unix()))*1e3)
})
Convey("When doing a metric query using epoch (float32 nullable) as time column and value column (float32 nullable) should return metric with time in milliseconds", func() {
@@ -593,7 +593,7 @@ func TestMySQL(t *testing.T) {
So(queryResult.Error, ShouldBeNil)
So(len(queryResult.Series), ShouldEqual, 1)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float64(float32(tInitial.Unix())))*1e3)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float32(tInitial.Unix()))*1e3)
})
Convey("When doing a metric query grouping by time and select metric column should return correct series", func() {
@@ -810,7 +810,7 @@ func TestMySQL(t *testing.T) {
columns := queryResult.Tables[0].Rows[0]
//Should be in milliseconds
So(columns[0].(int64), ShouldEqual, int64(dt.Unix()*1000))
So(columns[0].(int64), ShouldEqual, dt.Unix()*1000)
})
Convey("When doing an annotation query with a time column in epoch millisecond format should return ms", func() {

View File

@@ -504,7 +504,7 @@ func TestPostgres(t *testing.T) {
So(queryResult.Error, ShouldBeNil)
So(len(queryResult.Series), ShouldEqual, 1)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float64(float32(tInitial.Unix())))*1e3)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float32(tInitial.Unix()))*1e3)
})
Convey("When doing a metric query using epoch (float32 nullable) as time column and value column (float32 nullable) should return metric with time in milliseconds", func() {
@@ -526,7 +526,7 @@ func TestPostgres(t *testing.T) {
So(queryResult.Error, ShouldBeNil)
So(len(queryResult.Series), ShouldEqual, 1)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float64(float32(tInitial.Unix())))*1e3)
So(queryResult.Series[0].Points[0][1].Float64, ShouldEqual, float64(float32(tInitial.Unix()))*1e3)
})
Convey("When doing a metric query grouping by time and select metric column should return correct series", func() {
@@ -713,7 +713,7 @@ func TestPostgres(t *testing.T) {
columns := queryResult.Tables[0].Rows[0]
//Should be in milliseconds
So(columns[0].(int64), ShouldEqual, int64(dt.Unix()*1000))
So(columns[0].(int64), ShouldEqual, dt.Unix()*1000)
})
Convey("When doing an annotation query with a time column in epoch second format (int) should return ms", func() {
@@ -743,7 +743,7 @@ func TestPostgres(t *testing.T) {
columns := queryResult.Tables[0].Rows[0]
//Should be in milliseconds
So(columns[0].(int64), ShouldEqual, int64(dt.Unix()*1000))
So(columns[0].(int64), ShouldEqual, dt.Unix()*1000)
})
Convey("When doing an annotation query with a time column in epoch millisecond format should return ms", func() {

View File

@@ -68,6 +68,7 @@ func (e *DefaultSqlEngine) InitEngine(driverName string, dsInfo *models.DataSour
engine.SetMaxOpenConns(10)
engine.SetMaxIdleConns(10)
engineCache.versions[dsInfo.Id] = dsInfo.Version
engineCache.cache[dsInfo.Id] = engine
e.XormEngine = engine

View File

@@ -0,0 +1,25 @@
import * as ticks from '../utils/ticks';
describe('ticks', () => {
describe('getFlotTickDecimals()', () => {
let ctx: any = {};
beforeEach(() => {
ctx.axis = {};
});
it('should calculate decimals precision based on graph height', () => {
let dec = ticks.getFlotTickDecimals(0, 10, ctx.axis, 200);
expect(dec.tickDecimals).toBe(1);
expect(dec.scaledDecimals).toBe(1);
dec = ticks.getFlotTickDecimals(0, 100, ctx.axis, 200);
expect(dec.tickDecimals).toBe(0);
expect(dec.scaledDecimals).toBe(-1);
dec = ticks.getFlotTickDecimals(0, 1, ctx.axis, 200);
expect(dec.tickDecimals).toBe(2);
expect(dec.scaledDecimals).toBe(3);
});
});
});

View File

@@ -1,4 +1,5 @@
import TimeSeries from 'app/core/time_series2';
import { updateLegendValues } from 'app/core/time_series2';
describe('TimeSeries', function() {
var points, series;
@@ -118,6 +119,20 @@ describe('TimeSeries', function() {
series.getFlotPairs('null');
expect(series.stats.avg).toBe(null);
});
it('calculates timeStep', function() {
series = new TimeSeries({
datapoints: [[null, 1], [null, 2], [null, 3]],
});
series.getFlotPairs('null');
expect(series.stats.timeStep).toBe(1);
series = new TimeSeries({
datapoints: [[0, 1530529290], [0, 1530529305], [0, 1530529320]],
});
series.getFlotPairs('null');
expect(series.stats.timeStep).toBe(15);
});
});
describe('When checking if ms resolution is needed', function() {
@@ -311,4 +326,55 @@ describe('TimeSeries', function() {
expect(series.formatValue(-Infinity)).toBe('');
});
});
describe('legend decimals', function() {
let series, panel;
let height = 200;
beforeEach(function() {
testData = {
alias: 'test',
datapoints: [[1, 2], [0, 3], [10, 4], [8, 5]],
};
series = new TimeSeries(testData);
series.getFlotPairs();
panel = {
decimals: null,
yaxes: [
{
decimals: null,
},
],
};
});
it('should set decimals based on Y axis (expect calculated decimals = 1)', function() {
let data = [series];
// Expect ticks with this data will have decimals = 1
updateLegendValues(data, panel, height);
expect(data[0].decimals).toBe(2);
});
it('should set decimals based on Y axis to 0 if calculated decimals = 0)', function() {
testData.datapoints = [[10, 2], [0, 3], [100, 4], [80, 5]];
series = new TimeSeries(testData);
series.getFlotPairs();
let data = [series];
updateLegendValues(data, panel, height);
expect(data[0].decimals).toBe(0);
});
it('should set decimals to Y axis decimals + 1', function() {
panel.yaxes[0].decimals = 2;
let data = [series];
updateLegendValues(data, panel, height);
expect(data[0].decimals).toBe(3);
});
it('should set decimals to legend decimals value if it was set explicitly', function() {
panel.decimals = 3;
let data = [series];
updateLegendValues(data, panel, height);
expect(data[0].decimals).toBe(3);
});
});
});

View File

@@ -23,23 +23,27 @@ function translateFillOption(fill) {
* Calculate decimals for legend and update values for each series.
* @param data series data
* @param panel
* @param height
*/
export function updateLegendValues(data: TimeSeries[], panel) {
export function updateLegendValues(data: TimeSeries[], panel, height) {
for (let i = 0; i < data.length; i++) {
let series = data[i];
let yaxes = panel.yaxes;
const yaxes = panel.yaxes;
const seriesYAxis = series.yaxis || 1;
let axis = yaxes[seriesYAxis - 1];
let { tickDecimals, scaledDecimals } = getFlotTickDecimals(data, axis);
let formater = kbn.valueFormats[panel.yaxes[seriesYAxis - 1].format];
const axis = yaxes[seriesYAxis - 1];
let formater = kbn.valueFormats[axis.format];
// decimal override
if (_.isNumber(panel.decimals)) {
series.updateLegendValues(formater, panel.decimals, null);
} else if (_.isNumber(axis.decimals)) {
series.updateLegendValues(formater, axis.decimals + 1, null);
} else {
// auto decimals
// legend and tooltip gets one more decimal precision
// than graph legend ticks
const { datamin, datamax } = getDataMinMax(data);
let { tickDecimals, scaledDecimals } = getFlotTickDecimals(datamin, datamax, axis, height);
tickDecimals = (tickDecimals || -1) + 1;
series.updateLegendValues(formater, tickDecimals, scaledDecimals + 2);
}

View File
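The branch structure in `updateLegendValues` above encodes a precedence order: an explicit panel-level `decimals` wins, then an explicit per-axis `decimals` (plus one extra digit for the legend), and only otherwise are decimals derived from the tick calculation. A minimal sketch of just that precedence, with `autoDecimals` standing in for the `getFlotTickDecimals` call (a hypothetical helper signature, not Grafana's):

```typescript
// Decimal precedence for legend values: panel override > axis decimals + 1 >
// auto (one more digit than the graph's tick decimals).
function legendDecimals(
  panelDecimals: number | null,
  axisDecimals: number | null,
  autoDecimals: () => number
): number {
  if (typeof panelDecimals === "number") {
    return panelDecimals; // explicit legend override
  }
  if (typeof axisDecimals === "number") {
    return axisDecimals + 1; // legend gets one extra decimal vs the axis
  }
  // auto: legend/tooltip get one more decimal than the tick calculation,
  // mirroring `tickDecimals = (tickDecimals || -1) + 1` above
  return (autoDecimals() || -1) + 1;
}
```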

@@ -1,5 +1,3 @@
import { getDataMinMax } from 'app/core/time_series2';
/**
* Calculate tick step.
* Implementation from d3-array (ticks.js)
@@ -121,12 +119,10 @@ export function getFlotRange(panelMin, panelMax, datamin, datamax) {
* Calculate tick decimals.
* Implementation from Flot.
*/
export function getFlotTickDecimals(data, axis) {
let { datamin, datamax } = getDataMinMax(data);
let { min, max } = getFlotRange(axis.min, axis.max, datamin, datamax);
let noTicks = 3;
let tickDecimals, maxDec;
let delta = (max - min) / noTicks;
export function getFlotTickDecimals(datamin, datamax, axis, height) {
const { min, max } = getFlotRange(axis.min, axis.max, datamin, datamax);
const noTicks = 0.3 * Math.sqrt(height);
const delta = (max - min) / noTicks;
let dec = -Math.floor(Math.log(delta) / Math.LN10);
let magn = Math.pow(10, -dec);
@@ -139,19 +135,17 @@ export function getFlotTickDecimals(data, axis) {
} else if (norm < 3) {
size = 2;
// special case for 2.5, requires an extra decimal
if (norm > 2.25 && (maxDec == null || dec + 1 <= maxDec)) {
if (norm > 2.25) {
size = 2.5;
++dec;
}
} else if (norm < 7.5) {
size = 5;
} else {
size = 10;
}
size *= magn;
tickDecimals = Math.max(0, maxDec != null ? maxDec : dec);
const tickDecimals = Math.max(0, -Math.floor(Math.log(delta) / Math.LN10) + 1);
// grafana addition
const scaledDecimals = tickDecimals - Math.floor(Math.log(size) / Math.LN10);
return { tickDecimals, scaledDecimals };

View File
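The key change in `getFlotTickDecimals` above is that the tick count now scales with graph height (`0.3 * sqrt(height)`) instead of being fixed at 3, which is why the new unit test feeds in a height of 200. A condensed, self-contained version of the same Flot heuristic plus the Grafana `scaledDecimals` addition (the axis min/max clamping from `getFlotRange` is omitted for brevity, so data min/max are used directly):

```typescript
// Flot's tick-decimal heuristic with a height-dependent tick count.
function getFlotTickDecimals(datamin: number, datamax: number, height: number) {
  const noTicks = 0.3 * Math.sqrt(height); // tick count grows with height
  const delta = (datamax - datamin) / noTicks;
  let dec = -Math.floor(Math.log(delta) / Math.LN10);
  const magn = Math.pow(10, -dec);
  const norm = delta / magn; // norm is between 1.0 and 10.0
  let size: number;
  if (norm < 1.5) {
    size = 1;
  } else if (norm < 3) {
    size = 2;
    if (norm > 2.25) {
      // special case for 2.5, requires an extra decimal
      size = 2.5;
      ++dec;
    }
  } else if (norm < 7.5) {
    size = 5;
  } else {
    size = 10;
  }
  size *= magn;
  // legend/tooltip precision: one more decimal than the raw tick estimate
  const tickDecimals = Math.max(0, -Math.floor(Math.log(delta) / Math.LN10) + 1);
  // grafana addition
  const scaledDecimals = tickDecimals - Math.floor(Math.log(size) / Math.LN10);
  return { tickDecimals, scaledDecimals };
}
```

This reproduces the expectations in the `ticks.jest.ts` spec above: for a 200px graph, a 0..10 range yields `{ tickDecimals: 1, scaledDecimals: 1 }`, 0..100 yields `{ 0, -1 }`, and 0..1 yields `{ 2, 3 }`.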

@@ -75,6 +75,7 @@ export class AdminEditUserCtrl {
$scope.removeOrgUser = function(orgUser) {
backendSrv.delete('/api/orgs/' + orgUser.orgId + '/users/' + $scope.user_id).then(function() {
$scope.getUser($scope.user_id);
$scope.getUserOrgs($scope.user_id);
});
};
@@ -108,6 +109,7 @@ export class AdminEditUserCtrl {
$scope.newOrg.loginOrEmail = $scope.user.login;
backendSrv.post('/api/orgs/' + orgInfo.id + '/users/', $scope.newOrg).then(function() {
$scope.getUser($scope.user_id);
$scope.getUserOrgs($scope.user_id);
});
};

View File

@@ -22,10 +22,10 @@ export class DashboardModel {
editable: any;
graphTooltip: any;
time: any;
originalTime: any;
private originalTime: any;
timepicker: any;
templating: any;
originalTemplating: any;
private originalTemplating: any;
annotations: any;
refresh: any;
snapshot: any;
@@ -50,6 +50,8 @@ export class DashboardModel {
meta: true,
panels: true, // needs special handling
templating: true, // needs special handling
originalTime: true,
originalTemplating: true,
};
constructor(data, meta?) {
@@ -70,12 +72,8 @@ export class DashboardModel {
this.editable = data.editable !== false;
this.graphTooltip = data.graphTooltip || 0;
this.time = data.time || { from: 'now-6h', to: 'now' };
this.originalTime = _.cloneDeep(this.time);
this.timepicker = data.timepicker || {};
this.templating = this.ensureListExist(data.templating);
this.originalTemplating = _.map(this.templating.list, variable => {
return { name: variable.name, current: _.clone(variable.current) };
});
this.annotations = this.ensureListExist(data.annotations);
this.refresh = data.refresh;
this.snapshot = data.snapshot;
@@ -85,6 +83,9 @@ export class DashboardModel {
this.gnetId = data.gnetId || null;
this.panels = _.map(data.panels || [], panelData => new PanelModel(panelData));
this.resetOriginalVariables();
this.resetOriginalTime();
this.initMeta(meta);
this.updateSchema(data);
@@ -138,8 +139,8 @@ export class DashboardModel {
// cleans meta data and other non persistent state
getSaveModelClone(options?) {
let defaults = _.defaults(options || {}, {
saveVariables: false,
saveTimerange: false,
saveVariables: true,
saveTimerange: true,
});
// make clone
@@ -153,15 +154,23 @@ export class DashboardModel {
}
// get variable save models
//console.log(this.templating.list);
copy.templating = {
list: _.map(this.templating.list, variable => (variable.getSaveModel ? variable.getSaveModel() : variable)),
};
if (!defaults.saveVariables && copy.templating.list.length === this.originalTemplating.length) {
if (!defaults.saveVariables) {
for (let i = 0; i < copy.templating.list.length; i++) {
if (copy.templating.list[i].name === this.originalTemplating[i].name) {
copy.templating.list[i].current = this.originalTemplating[i].current;
let current = copy.templating.list[i];
let original = _.find(this.originalTemplating, { name: current.name, type: current.type });
if (!original) {
continue;
}
if (current.type === 'adhoc') {
copy.templating.list[i].filters = original.filters;
} else {
copy.templating.list[i].current = original.current;
}
}
}
@@ -785,4 +794,40 @@ export class DashboardModel {
let migrator = new DashboardMigrator(this);
migrator.updateSchema(old);
}
resetOriginalTime() {
this.originalTime = _.cloneDeep(this.time);
}
hasTimeChanged() {
return !_.isEqual(this.time, this.originalTime);
}
resetOriginalVariables() {
this.originalTemplating = _.map(this.templating.list, variable => {
return {
name: variable.name,
type: variable.type,
current: _.cloneDeep(variable.current),
filters: _.cloneDeep(variable.filters),
};
});
}
hasVariableValuesChanged() {
if (this.templating.list.length !== this.originalTemplating.length) {
return false;
}
const updated = _.map(this.templating.list, variable => {
return {
name: variable.name,
type: variable.type,
current: _.cloneDeep(variable.current),
filters: _.cloneDeep(variable.filters),
};
});
return !_.isEqual(updated, this.originalTemplating);
}
}

View File
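The `hasTimeChanged`/`resetOriginalTime` methods above follow a simple change-tracking pattern: snapshot the state with a deep clone at construction (and again on save), then detect edits with a deep equality check. A dependency-free sketch of that pattern — JSON round-tripping stands in for lodash's `cloneDeep`/`isEqual` here, which is a simplification (it assumes stable key order and JSON-serializable values):

```typescript
// Snapshot-and-compare change tracking, as used for originalTime above.
interface TimeRange {
  from: string;
  to: string;
}

class TimeTracker {
  time: TimeRange;
  private originalTime: TimeRange;

  constructor(time: TimeRange) {
    this.time = time;
    this.originalTime = JSON.parse(JSON.stringify(time)); // deep clone snapshot
  }

  hasTimeChanged(): boolean {
    // deep comparison against the snapshot
    return JSON.stringify(this.time) !== JSON.stringify(this.originalTime);
  }

  resetOriginalTime() {
    // re-snapshot after a save that included the time range
    this.originalTime = JSON.parse(JSON.stringify(this.time));
  }
}
```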

@@ -63,8 +63,7 @@ export class DashboardExporter {
);
};
// check up panel data sources
for (let panel of saveModel.panels) {
const processPanel = panel => {
if (panel.datasource !== undefined) {
templateizeDatasourceUsage(panel);
}
@@ -86,6 +85,18 @@ export class DashboardExporter {
version: panelDef.info.version,
};
}
};
// check up panel data sources
for (let panel of saveModel.panels) {
processPanel(panel);
// handle collapsed rows
if (panel.collapsed !== undefined && panel.collapsed === true && panel.panels) {
for (let rowPanel of panel.panels) {
processPanel(rowPanel);
}
}
}
// templatize template vars

View File
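The exporter fix above extracts the per-panel work into a `processPanel` callback so the loop can descend one level into collapsed rows, whose child panels live in a nested `panels` array rather than at the top level. A minimal sketch of that traversal (panel shape simplified for illustration):

```typescript
// Visit every panel, including those nested inside collapsed rows.
interface Panel {
  id: number;
  datasource?: string;
  collapsed?: boolean;
  panels?: Panel[];
}

function forEachPanel(panels: Panel[], processPanel: (p: Panel) => void) {
  for (const panel of panels) {
    processPanel(panel);
    // handle collapsed rows: their children are nested, not top-level,
    // so a single flat loop would miss them
    if (panel.collapsed === true && panel.panels) {
      for (const rowPanel of panel.panels) {
        processPanel(rowPanel);
      }
    }
  }
}
```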

@@ -1,5 +1,4 @@
import coreModule from 'app/core/core_module';
import _ from 'lodash';
const template = `
<div class="modal-body">
@@ -70,7 +69,6 @@ export class SaveDashboardModalCtrl {
message: string;
saveVariables = false;
saveTimerange = false;
templating: any;
time: any;
originalTime: any;
current = [];
@@ -87,40 +85,8 @@ export class SaveDashboardModalCtrl {
this.message = '';
this.max = 64;
this.isSaving = false;
this.templating = dashboardSrv.dash.templating.list;
this.compareTemplating();
this.compareTime();
}
compareTime() {
if (_.isEqual(this.dashboardSrv.dash.time, this.dashboardSrv.dash.originalTime)) {
this.timeChange = false;
} else {
this.timeChange = true;
}
}
compareTemplating() {
//checks if variables has been added or removed, if so variables will be saved automatically
if (this.dashboardSrv.dash.originalTemplating.length !== this.dashboardSrv.dash.templating.list.length) {
return (this.variableValueChange = false);
}
//checks if variable value has changed
if (this.dashboardSrv.dash.templating.list.length > 0) {
for (let i = 0; i < this.dashboardSrv.dash.templating.list.length; i++) {
if (
this.dashboardSrv.dash.templating.list[i].current.text !==
this.dashboardSrv.dash.originalTemplating[i].current.text
) {
return (this.variableValueChange = true);
}
}
return (this.variableValueChange = false);
} else {
return (this.variableValueChange = false);
}
this.timeChange = this.dashboardSrv.getCurrent().hasTimeChanged();
this.variableValueChange = this.dashboardSrv.getCurrent().hasVariableValuesChanged();
}
save() {
@@ -139,7 +105,19 @@ export class SaveDashboardModalCtrl {
this.isSaving = true;
return this.dashboardSrv.save(saveModel, options).then(this.dismiss);
return this.dashboardSrv.save(saveModel, options).then(this.postSave.bind(this, options));
}
postSave(options) {
if (options.saveVariables) {
this.dashboardSrv.getCurrent().resetOriginalVariables();
}
if (options.saveTimerange) {
this.dashboardSrv.getCurrent().resetOriginalTime();
}
this.dismiss();
}
}

View File

@@ -123,6 +123,9 @@ export class ShareSnapshotCtrl {
enable: annotation.enable,
iconColor: annotation.iconColor,
snapshotData: annotation.snapshotData,
type: annotation.type,
builtIn: annotation.builtIn,
hide: annotation.hide,
};
})
.value();

View File

@@ -435,8 +435,67 @@ describe('DashboardModel', function() {
});
});
describe('save variables and timeline', () => {
let model;
describe('Given model with time', () => {
let model: DashboardModel;
beforeEach(() => {
model = new DashboardModel({
time: {
from: 'now-6h',
to: 'now',
},
});
expect(model.hasTimeChanged()).toBeFalsy();
model.time = {
from: 'now-3h',
to: 'now-1h',
};
});
it('hasTimeChanged should be true', () => {
expect(model.hasTimeChanged()).toBeTruthy();
});
it('getSaveModelClone should return original time when saveTimerange=false', () => {
let options = { saveTimerange: false };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.time.from).toBe('now-6h');
expect(saveModel.time.to).toBe('now');
});
it('getSaveModelClone should return updated time when saveTimerange=true', () => {
let options = { saveTimerange: true };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.time.from).toBe('now-3h');
expect(saveModel.time.to).toBe('now-1h');
});
it('hasTimeChanged should be false when reset original time', () => {
model.resetOriginalTime();
expect(model.hasTimeChanged()).toBeFalsy();
});
it('getSaveModelClone should return original time when saveTimerange=false', () => {
let options = { saveTimerange: false };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.time.from).toBe('now-6h');
expect(saveModel.time.to).toBe('now');
});
it('getSaveModelClone should return updated time when saveTimerange=true', () => {
let options = { saveTimerange: true };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.time.from).toBe('now-3h');
expect(saveModel.time.to).toBe('now-1h');
});
});
describe('Given model with template variable of type query', () => {
let model: DashboardModel;
beforeEach(() => {
model = new DashboardModel({
@@ -444,6 +503,7 @@ describe('DashboardModel', function() {
list: [
{
name: 'Server',
type: 'query',
current: {
selected: true,
text: 'server_001',
@@ -452,45 +512,127 @@ describe('DashboardModel', function() {
},
],
},
time: {
from: 'now-6h',
to: 'now',
},
});
model.templating.list[0] = {
name: 'Server',
expect(model.hasVariableValuesChanged()).toBeFalsy();
});
it('hasVariableValuesChanged should be false when adding a template variable', () => {
model.templating.list.push({
name: 'Server2',
type: 'query',
current: {
selected: true,
text: 'server_002',
value: 'server_002',
},
};
model.time = {
from: 'now-3h',
to: 'now',
};
});
expect(model.hasVariableValuesChanged()).toBeFalsy();
});
it('should not save variables and timeline', () => {
let options = {
saveVariables: false,
saveTimerange: false,
};
it('hasVariableValuesChanged should be false when removing existing template variable', () => {
model.templating.list = [];
expect(model.hasVariableValuesChanged()).toBeFalsy();
});
it('hasVariableValuesChanged should be true when changing value of template variable', () => {
model.templating.list[0].current.text = 'server_002';
expect(model.hasVariableValuesChanged()).toBeTruthy();
});
it('getSaveModelClone should return original variable when saveVariables=false', () => {
model.templating.list[0].current.text = 'server_002';
let options = { saveVariables: false };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.templating.list[0].current.text).toBe('server_001');
expect(saveModel.time.from).toBe('now-6h');
});
it('should save variables and timeline', () => {
let options = {
saveVariables: true,
saveTimerange: true,
};
it('getSaveModelClone should return updated variable when saveVariables=true', () => {
model.templating.list[0].current.text = 'server_002';
let options = { saveVariables: true };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.templating.list[0].current.text).toBe('server_002');
expect(saveModel.time.from).toBe('now-3h');
});
});
describe('Given model with template variable of type adhoc', () => {
let model: DashboardModel;
beforeEach(() => {
model = new DashboardModel({
templating: {
list: [
{
name: 'Filter',
type: 'adhoc',
filters: [
{
key: '@hostname',
operator: '=',
value: 'server 20',
},
],
},
],
},
});
expect(model.hasVariableValuesChanged()).toBeFalsy();
});
it('hasVariableValuesChanged should be false when adding a template variable', () => {
model.templating.list.push({
name: 'Filter',
type: 'adhoc',
filters: [
{
key: '@hostname',
operator: '=',
value: 'server 1',
},
],
});
expect(model.hasVariableValuesChanged()).toBeFalsy();
});
it('hasVariableValuesChanged should be false when removing existing template variable', () => {
model.templating.list = [];
expect(model.hasVariableValuesChanged()).toBeFalsy();
});
it('hasVariableValuesChanged should be true when changing value of filter', () => {
model.templating.list[0].filters[0].value = 'server 1';
expect(model.hasVariableValuesChanged()).toBeTruthy();
});
it('hasVariableValuesChanged should be true when adding an additional condition', () => {
model.templating.list[0].filters[0].condition = 'AND';
model.templating.list[0].filters[1] = {
key: '@metric',
operator: '=',
value: 'logins.count',
};
expect(model.hasVariableValuesChanged()).toBeTruthy();
});
it('getSaveModelClone should return original variable when saveVariables=false', () => {
model.templating.list[0].filters[0].value = 'server 1';
let options = { saveVariables: false };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.templating.list[0].filters[0].value).toBe('server 20');
});
it('getSaveModelClone should return updated variable when saveVariables=true', () => {
model.templating.list[0].filters[0].value = 'server 1';
let options = { saveVariables: true };
let saveModel = model.getSaveModelClone(options);
expect(saveModel.templating.list[0].filters[0].value).toBe('server 1');
});
});
});

View File

@@ -62,6 +62,27 @@ describe('given dashboard with repeated panels', () => {
type: 'graph',
},
{ id: 3, repeat: null, repeatPanelId: 2 },
{
id: 4,
collapsed: true,
panels: [
{ id: 10, datasource: 'gfdb', type: 'table' },
{ id: 11 },
{
id: 12,
datasource: '-- Mixed --',
targets: [{ datasource: 'other' }],
},
{ id: 13, datasource: '$ds' },
{
id: 14,
repeat: 'apps',
datasource: 'gfdb',
type: 'heatmap',
},
{ id: 15, repeat: null, repeatPanelId: 14 },
],
},
],
};
@@ -78,6 +99,18 @@ describe('given dashboard with repeated panels', () => {
info: { version: '1.1.0' },
};
config.panels['table'] = {
id: 'table',
name: 'Table',
info: { version: '1.1.1' },
};
config.panels['heatmap'] = {
id: 'heatmap',
name: 'Heatmap',
info: { version: '1.1.2' },
};
dash = new DashboardModel(dash, {});
var exporter = new DashboardExporter(datasourceSrvStub);
exporter.makeExportable(dash).then(clean => {
@@ -91,6 +124,11 @@ describe('given dashboard with repeated panels', () => {
expect(panel.datasource).toBe('${DS_GFDB}');
});
it('should replace datasource refs in collapsed row', () => {
var panel = exported.panels[5].panels[0];
expect(panel.datasource).toBe('${DS_GFDB}');
});
it('should replace datasource in variable query', () => {
expect(exported.templating.list[0].datasource).toBe('${DS_GFDB}');
expect(exported.templating.list[0].options.length).toBe(0);
@@ -126,13 +164,27 @@ describe('given dashboard with repeated panels', () => {
expect(require).not.toBe(undefined);
});
it('should add panel to required', () => {
it('should add graph panel to required', () => {
var require = _.find(exported.__requires, { name: 'Graph' });
expect(require.name).toBe('Graph');
expect(require.id).toBe('graph');
expect(require.version).toBe('1.1.0');
});
it('should add table panel to required', () => {
var require = _.find(exported.__requires, { name: 'Table' });
expect(require.name).toBe('Table');
expect(require.id).toBe('table');
expect(require.version).toBe('1.1.1');
});
it('should add heatmap panel to required', () => {
var require = _.find(exported.__requires, { name: 'Heatmap' });
expect(require.name).toBe('Heatmap');
expect(require.id).toBe('heatmap');
expect(require.version).toBe('1.1.2');
});
it('should add grafana version', () => {
var require = _.find(exported.__requires, { name: 'Grafana' });
expect(require.type).toBe('grafana');

View File

@@ -1,128 +1,57 @@
import { SaveDashboardModalCtrl } from '../save_modal';
jest.mock('app/core/services/context_srv', () => ({}));
const setup = (timeChanged, variableValuesChanged, cb) => {
const dash = {
hasTimeChanged: jest.fn().mockReturnValue(timeChanged),
hasVariableValuesChanged: jest.fn().mockReturnValue(variableValuesChanged),
resetOriginalTime: jest.fn(),
resetOriginalVariables: jest.fn(),
getSaveModelClone: jest.fn().mockReturnValue({}),
};
const dashboardSrvMock = {
getCurrent: jest.fn().mockReturnValue(dash),
save: jest.fn().mockReturnValue(Promise.resolve()),
};
const ctrl = new SaveDashboardModalCtrl(dashboardSrvMock);
ctrl.saveForm = {
$valid: true,
};
ctrl.dismiss = () => Promise.resolve();
cb(dash, ctrl, dashboardSrvMock);
};
describe('SaveDashboardModal', () => {
describe('save modal checkboxes', () => {
it('should show checkboxes', () => {
let fakeDashboardSrv = {
dash: {
templating: {
list: [
{
current: {
selected: true,
tags: Array(0),
text: 'server_001',
value: 'server_001',
},
name: 'Server',
},
],
},
originalTemplating: [
{
current: {
selected: true,
text: 'server_002',
value: 'server_002',
},
name: 'Server',
},
],
time: {
from: 'now-3h',
to: 'now',
},
originalTime: {
from: 'now-6h',
to: 'now',
},
},
};
let modal = new SaveDashboardModalCtrl(fakeDashboardSrv);
expect(modal.timeChange).toBe(true);
expect(modal.variableValueChange).toBe(true);
describe('Given time and template variable values have not changed', () => {
setup(false, false, (dash, ctrl: SaveDashboardModalCtrl) => {
it('When creating ctrl should set time and template variable values changed', () => {
expect(ctrl.timeChange).toBeFalsy();
expect(ctrl.variableValueChange).toBeFalsy();
});
});
});
it('should hide checkboxes', () => {
let fakeDashboardSrv = {
dash: {
templating: {
list: [
{
current: {
selected: true,
text: 'server_002',
value: 'server_002',
},
name: 'Server',
},
],
},
originalTemplating: [
{
current: {
selected: true,
text: 'server_002',
value: 'server_002',
},
name: 'Server',
},
],
time: {
from: 'now-3h',
to: 'now',
},
originalTime: {
from: 'now-3h',
to: 'now',
},
},
};
let modal = new SaveDashboardModalCtrl(fakeDashboardSrv);
expect(modal.timeChange).toBe(false);
expect(modal.variableValueChange).toBe(false);
});
describe('Given time and template variable values have changed', () => {
setup(true, true, (dash, ctrl: SaveDashboardModalCtrl) => {
it('When creating ctrl should set time and template variable values changed', () => {
expect(ctrl.timeChange).toBeTruthy();
expect(ctrl.variableValueChange).toBeTruthy();
});
it('should hide variable checkboxes', () => {
let fakeDashboardSrv = {
dash: {
templating: {
list: [
{
current: {
selected: true,
text: 'server_002',
value: 'server_002',
},
name: 'Server',
},
{
current: {
selected: true,
text: 'web_002',
value: 'web_002',
},
name: 'Web',
},
],
},
originalTemplating: [
{
current: {
selected: true,
text: 'server_002',
value: 'server_002',
},
name: 'Server',
},
],
},
};
let modal = new SaveDashboardModalCtrl(fakeDashboardSrv);
expect(modal.variableValueChange).toBe(false);
it('When save time and variable value changes disabled and saving should reset original time and template variable values', async () => {
ctrl.saveTimerange = false;
ctrl.saveVariables = false;
await ctrl.save();
expect(dash.resetOriginalTime).toHaveBeenCalledTimes(0);
expect(dash.resetOriginalVariables).toHaveBeenCalledTimes(0);
});
it('When save time and variable value changes enabled and saving should reset original time and template variable values', async () => {
ctrl.saveTimerange = true;
ctrl.saveVariables = true;
await ctrl.save();
expect(dash.resetOriginalTime).toHaveBeenCalledTimes(1);
expect(dash.resetOriginalVariables).toHaveBeenCalledTimes(1);
});
});
});
});

View File

@@ -41,18 +41,20 @@ function dashLink($compile, $sanitize, linkSrv) {
elem.html(template);
$compile(elem.contents())(scope);
var anchor = elem.find('a');
var icon = elem.find('i');
var span = elem.find('span');
function update() {
var linkInfo = linkSrv.getAnchorInfo(link);
span.text(linkInfo.title);
anchor.attr('href', linkInfo.href);
sanitizeAnchor();
const anchor = elem.find('a');
const span = elem.find('span');
span.text(linkInfo.title);
if (!link.asDropdown) {
anchor.attr('href', linkInfo.href);
sanitizeAnchor();
}
anchor.attr('data-placement', 'bottom');
// tooltip
elem.find('a').tooltip({
anchor.tooltip({
title: $sanitize(scope.link.tooltip),
html: true,
container: 'body',
@@ -60,12 +62,13 @@ function dashLink($compile, $sanitize, linkSrv) {
}
function sanitizeAnchor() {
const anchor = elem.find('a');
const anchorSanitized = $sanitize(anchor.parent().html());
anchor.parent().html(anchorSanitized);
}
icon.attr('class', 'fa fa-fw ' + scope.link.icon);
anchor.attr('target', scope.link.target);
elem.find('i').attr('class', 'fa fa-fw ' + scope.link.icon);
elem.find('a').attr('target', scope.link.target);
// fix for menus on the far right
if (link.asDropdown && scope.$last) {

View File

@@ -222,7 +222,7 @@ class MetricsPanelCtrl extends PanelCtrl {
// and add built in variables interval and interval_ms
var scopedVars = Object.assign({}, this.panel.scopedVars, {
__interval: { text: this.interval, value: this.interval },
__interval_ms: { text: this.intervalMs, value: this.intervalMs },
__interval_ms: { text: String(this.intervalMs), value: String(this.intervalMs) },
});
var metricsQuery = {

View File
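The one-line change above wraps `intervalMs` in `String()` because scoped variables are substituted into query text as strings; a raw number in `$__interval_ms` could break interpolation for data sources like Prometheus. A sketch of the idea, with a simplified `interpolate` standing in for Grafana's `templateSrv` (the helper is illustrative, not the real API):

```typescript
// Scoped variables are interpolated textually, so numeric values are
// coerced to strings up front.
function buildScopedVars(interval: string, intervalMs: number) {
  return {
    __interval: { text: interval, value: interval },
    __interval_ms: { text: String(intervalMs), value: String(intervalMs) },
  };
}

// Simplified stand-in for template interpolation: replace $name tokens
// with the matching scoped-variable value.
function interpolate(query: string, vars: Record<string, { value: string }>): string {
  return query.replace(/\$(\w+)/g, (match: string, name: string) =>
    vars[name] !== undefined ? vars[name].value : match
  );
}
```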

@@ -7,7 +7,7 @@ export class DatasourceSrv {
datasources: any;
/** @ngInject */
constructor(private $q, private $injector, $rootScope, private templateSrv) {
constructor(private $q, private $injector, private $rootScope, private templateSrv) {
this.init();
}
@@ -61,7 +61,7 @@ export class DatasourceSrv {
this.datasources[name] = instance;
deferred.resolve(instance);
})
.catch(function(err) {
.catch(err => {
this.$rootScope.appEvent('alert-error', [dsConfig.name + ' plugin failed', err.toString()]);
});

View File
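The `.catch` change above swaps a `function` expression for an arrow function because a plain `function` gets its own `this`, so `this.$rootScope` inside the callback was undefined; an arrow function closes over the enclosing instance. A minimal synchronous illustration of the difference (names are illustrative):

```typescript
// Arrow functions capture the enclosing `this`; function expressions do not.
class Notifier {
  events: string[] = [];

  emit(msg: string) {
    this.events.push(msg);
  }

  loadBroken(fail: (cb: (err: string) => void) => void) {
    fail(function (this: any, _err: string) {
      // `this` here is NOT the Notifier instance (undefined in strict mode);
      // calling this.emit(...) would throw -- this mirrors the original bug
    });
  }

  loadFixed(fail: (cb: (err: string) => void) => void) {
    fail(err => {
      this.emit("plugin failed: " + err); // arrow keeps the instance `this`
    });
  }
}
```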

@@ -56,7 +56,7 @@ System.config({
css: 'vendor/plugin-css/css.js',
},
meta: {
'plugin*': {
'/*': {
esModule: true,
authorization: true,
loader: 'plugin-loader',

View File

@@ -179,4 +179,38 @@ describe('VariableSrv init', function() {
expect(variable.options[2].selected).to.be(false);
});
});
describeInitScenario('when template variable is present in url multiple times using key/values', scenario => {
scenario.setup(() => {
scenario.variables = [
{
name: 'apps',
type: 'query',
multi: true,
current: { text: 'Val1', value: 'val1' },
options: [
{ text: 'Val1', value: 'val1' },
{ text: 'Val2', value: 'val2' },
{ text: 'Val3', value: 'val3', selected: true },
],
},
];
scenario.urlParams['var-apps'] = ['val2', 'val1'];
});
it('should update current value', function() {
var variable = ctx.variableSrv.variables[0];
expect(variable.current.value.length).to.be(2);
expect(variable.current.value[0]).to.be('val2');
expect(variable.current.value[1]).to.be('val1');
expect(variable.current.text).to.be('Val2 + Val1');
expect(variable.options[0].selected).to.be(true);
expect(variable.options[1].selected).to.be(true);
});
it('should set options that are not in value to selected false', function() {
var variable = ctx.variableSrv.variables[0];
expect(variable.options[2].selected).to.be(false);
});
});
});

View File

@@ -209,7 +209,24 @@ export class VariableSrv {
return op.text === urlValue || op.value === urlValue;
});
option = option || { text: urlValue, value: urlValue };
let defaultText = urlValue;
let defaultValue = urlValue;
if (!option && _.isArray(urlValue)) {
defaultText = [];
for (let n = 0; n < urlValue.length; n++) {
let t = _.find(variable.options, op => {
return op.value === urlValue[n];
});
if (t) {
defaultText.push(t.text);
}
}
}
option = option || { text: defaultText, value: defaultValue };
return variable.setValue(option);
});
}
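The hunk above maps each value from the URL back to the display text of its matching option. A minimal standalone sketch of that lookup (hypothetical `resolveUrlValue` helper and types; the real code lives in `VariableSrv` and uses lodash) might be:

```typescript
interface VariableOption {
  text: string;
  value: string;
}

// Resolve a raw URL value (single or multi) to { text, value }:
// reuse a matching option's text where one exists, otherwise fall
// back to the raw value itself.
function resolveUrlValue(
  options: VariableOption[],
  urlValue: string | string[]
): { text: string | string[]; value: string | string[] } {
  if (!Array.isArray(urlValue)) {
    const match = options.find(op => op.text === urlValue || op.value === urlValue);
    return match || { text: urlValue, value: urlValue };
  }
  // Multi-value case: collect the display text of every matching option,
  // preserving the order the values appeared in the URL.
  const texts: string[] = [];
  for (const v of urlValue) {
    const match = options.find(op => op.value === v);
    if (match) {
      texts.push(match.text);
    }
  }
  return { text: texts, value: urlValue };
}

const opts: VariableOption[] = [
  { text: 'Val1', value: 'val1' },
  { text: 'Val2', value: 'val2' },
];
const resolved = resolveUrlValue(opts, ['val2', 'val1']);
// resolved.text is ['Val2', 'Val1'], matching the test expectations above
```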

View File

@@ -11,14 +11,30 @@ export default class ResponseParser {
return [];
}
var influxdb11format = query.toLowerCase().indexOf('show tag values') >= 0;
var normalizedQuery = query.toLowerCase();
var isValueFirst =
normalizedQuery.indexOf('show field keys') >= 0 || normalizedQuery.indexOf('show retention policies') >= 0;
var res = {};
_.each(influxResults.series, serie => {
_.each(serie.values, value => {
if (_.isArray(value)) {
if (influxdb11format) {
addUnique(res, value[1] || value[0]);
// In general, a returned value can have one of two shapes.
// The first is a two-element array whose first element is
// metadata: the tag name for SHOW TAG VALUES queries,
// the time field for SELECT queries, and so on.
// The second shape is a one-element array containing
// the value itself; SHOW FIELD KEYS queries, for example,
// return this shape.
// Note that pre-0.11 versions return the second shape
// for SHOW TAG VALUES queries, while newer versions
// return the first.
if (isValueFirst) {
addUnique(res, value[0]);
} else if (value[1] !== undefined) {
addUnique(res, value[1]);
} else {
addUnique(res, value[0]);
}
@@ -29,7 +45,7 @@ export default class ResponseParser {
});
return _.map(res, value => {
return { text: value };
return { text: value.toString() };
});
}
}
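The branching above can be distilled into a small standalone function (hypothetical `pickColumn`, not the actual parser) that selects which column of an InfluxDB result row carries the value:

```typescript
// Two row shapes occur in InfluxDB meta-query results:
//   [meta, value]  e.g. SHOW TAG VALUES on >= 0.11 ([tagKey, tagValue])
//   [value]        e.g. SHOW FIELD KEYS, SHOW MEASUREMENTS
// For "value first" queries the interesting column is row[0]; otherwise
// it is row[1] when present, falling back to row[0].
function pickColumn(query: string, row: any[]): any {
  const q = query.toLowerCase();
  const isValueFirst =
    q.indexOf('show field keys') >= 0 || q.indexOf('show retention policies') >= 0;
  if (isValueFirst) {
    return row[0];
  }
  return row[1] !== undefined ? row[1] : row[0];
}

const field = pickColumn('SHOW FIELD KEYS FROM "cpu"', ['time', 'float']); // 'time'
const tag = pickColumn('SHOW TAG VALUES WITH KEY = "host"', ['host', 'server01']); // 'server01'
const measurement = pickColumn('SHOW MEASUREMENTS', ['cpu']); // 'cpu'
```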

View File

@@ -85,30 +85,36 @@ describe('influxdb response parser', () => {
});
});
describe('SELECT response', () => {
var query = 'SELECT "usage_iowait" FROM "cpu" LIMIT 10';
var response = {
results: [
{
series: [
{
name: 'cpu',
columns: ['time', 'usage_iowait'],
values: [[1488465190006040638, 0.0], [1488465190006040638, 15.0], [1488465190006040638, 20.2]],
},
],
},
],
};
var result = parser.parse(query, response);
it('should return second column', () => {
expect(_.size(result)).toBe(3);
expect(result[0].text).toBe('0');
expect(result[1].text).toBe('15');
expect(result[2].text).toBe('20.2');
});
});
describe('SHOW FIELD response', () => {
var query = 'SHOW FIELD KEYS FROM "cpu"';
describe('response from 0.10.0', () => {
var response = {
results: [
{
series: [
{
name: 'measurements',
columns: ['name'],
values: [['cpu'], ['derivative'], ['logins.count'], ['logs'], ['payment.ended'], ['payment.started']],
},
],
},
],
};
var result = parser.parse(query, response);
it('should get six responses', () => {
expect(_.size(result)).toBe(6);
});
});
describe('response from 0.11.0', () => {
describe('response from pre-1.0', () => {
var response = {
results: [
{
@@ -129,5 +135,28 @@ describe('influxdb response parser', () => {
expect(_.size(result)).toBe(1);
});
});
describe('response from 1.0', () => {
var response = {
results: [
{
series: [
{
name: 'cpu',
columns: ['fieldKey', 'fieldType'],
values: [['time', 'float']],
},
],
},
],
};
var result = parser.parse(query, response);
it('should return first column', () => {
expect(_.size(result)).toBe(1);
expect(result[0].text).toBe('time');
});
});
});
});

View File

@@ -162,8 +162,8 @@ export class PrometheusDatasource {
format: activeTargets[index].format,
step: queries[index].step,
legendFormat: activeTargets[index].legendFormat,
start: start,
end: end,
start: queries[index].start,
end: queries[index].end,
query: queries[index].expr,
responseListLength: responseList.length,
responseIndex: index,
@@ -196,7 +196,7 @@ export class PrometheusDatasource {
interval = adjustedInterval;
scopedVars = Object.assign({}, options.scopedVars, {
__interval: { text: interval + 's', value: interval + 's' },
__interval_ms: { text: interval * 1000, value: interval * 1000 },
__interval_ms: { text: String(interval * 1000), value: String(interval * 1000) },
});
}
query.step = interval;
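The `String(...)` coercion above matters because scoped variable values end up spliced into query text. A toy interpolation (hypothetical `interpolate` helper, not Grafana's actual templateSrv) shows the shape of that substitution:

```typescript
interface ScopedVar {
  text: string;
  value: string;
}

// Replace $__name tokens in a query with the matching scoped variable's
// value. Keeping value a string (hence String(interval * 1000)) means
// the substitution and any later string operations on it are safe.
function interpolate(expr: string, vars: { [name: string]: ScopedVar }): string {
  return expr.replace(/\$__(\w+)/g, (match: string, name: string) => {
    const v = vars['__' + name];
    return v ? v.value : match;
  });
}

const scopedVars = {
  __interval: { text: '10s', value: '10s' },
  __interval_ms: { text: String(10 * 1000), value: String(10 * 1000) },
};
const expr = interpolate('rate(http_requests_total[$__interval_ms])', scopedVars);
// expr === 'rate(http_requests_total[10000])'
```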

View File

@@ -68,7 +68,7 @@ describe('PrometheusDatasource', () => {
ctx.query = {
range: { from: moment(1443454528000), to: moment(1443454528000) },
targets: [{ expr: 'test{job="testjob"}', format: 'heatmap', legendFormat: '{{le}}' }],
interval: '60s',
interval: '1s',
};
});

View File

@@ -452,7 +452,7 @@ describe('PrometheusDatasource', function() {
interval: '10s',
scopedVars: {
__interval: { text: '10s', value: '10s' },
__interval_ms: { text: 10 * 1000, value: 10 * 1000 },
__interval_ms: { text: String(10 * 1000), value: String(10 * 1000) },
},
};
var urlExpected =
@@ -463,8 +463,8 @@ describe('PrometheusDatasource', function() {
expect(query.scopedVars.__interval.text).to.be('10s');
expect(query.scopedVars.__interval.value).to.be('10s');
expect(query.scopedVars.__interval_ms.text).to.be(10 * 1000);
expect(query.scopedVars.__interval_ms.value).to.be(10 * 1000);
expect(query.scopedVars.__interval_ms.text).to.be(String(10 * 1000));
expect(query.scopedVars.__interval_ms.value).to.be(String(10 * 1000));
});
it('should be min interval when it is greater than auto interval', function() {
var query = {
@@ -479,7 +479,7 @@ describe('PrometheusDatasource', function() {
interval: '5s',
scopedVars: {
__interval: { text: '5s', value: '5s' },
__interval_ms: { text: 5 * 1000, value: 5 * 1000 },
__interval_ms: { text: String(5 * 1000), value: String(5 * 1000) },
},
};
var urlExpected =
@@ -490,8 +490,8 @@ describe('PrometheusDatasource', function() {
expect(query.scopedVars.__interval.text).to.be('5s');
expect(query.scopedVars.__interval.value).to.be('5s');
expect(query.scopedVars.__interval_ms.text).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.value).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.text).to.be(String(5 * 1000));
expect(query.scopedVars.__interval_ms.value).to.be(String(5 * 1000));
});
it('should account for intervalFactor', function() {
var query = {
@@ -507,7 +507,7 @@ describe('PrometheusDatasource', function() {
interval: '10s',
scopedVars: {
__interval: { text: '10s', value: '10s' },
__interval_ms: { text: 10 * 1000, value: 10 * 1000 },
__interval_ms: { text: String(10 * 1000), value: String(10 * 1000) },
},
};
var urlExpected =
@@ -518,8 +518,8 @@ describe('PrometheusDatasource', function() {
expect(query.scopedVars.__interval.text).to.be('10s');
expect(query.scopedVars.__interval.value).to.be('10s');
expect(query.scopedVars.__interval_ms.text).to.be(10 * 1000);
expect(query.scopedVars.__interval_ms.value).to.be(10 * 1000);
expect(query.scopedVars.__interval_ms.text).to.be(String(10 * 1000));
expect(query.scopedVars.__interval_ms.value).to.be(String(10 * 1000));
});
it('should be interval * intervalFactor when greater than min interval', function() {
var query = {
@@ -535,7 +535,7 @@ describe('PrometheusDatasource', function() {
interval: '5s',
scopedVars: {
__interval: { text: '5s', value: '5s' },
__interval_ms: { text: 5 * 1000, value: 5 * 1000 },
__interval_ms: { text: String(5 * 1000), value: String(5 * 1000) },
},
};
var urlExpected =
@@ -546,8 +546,8 @@ describe('PrometheusDatasource', function() {
expect(query.scopedVars.__interval.text).to.be('5s');
expect(query.scopedVars.__interval.value).to.be('5s');
expect(query.scopedVars.__interval_ms.text).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.value).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.text).to.be(String(5 * 1000));
expect(query.scopedVars.__interval_ms.value).to.be(String(5 * 1000));
});
it('should be min interval when greater than interval * intervalFactor', function() {
var query = {
@@ -563,7 +563,7 @@ describe('PrometheusDatasource', function() {
interval: '5s',
scopedVars: {
__interval: { text: '5s', value: '5s' },
__interval_ms: { text: 5 * 1000, value: 5 * 1000 },
__interval_ms: { text: String(5 * 1000), value: String(5 * 1000) },
},
};
var urlExpected =
@@ -574,8 +574,8 @@ describe('PrometheusDatasource', function() {
expect(query.scopedVars.__interval.text).to.be('5s');
expect(query.scopedVars.__interval.value).to.be('5s');
expect(query.scopedVars.__interval_ms.text).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.value).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.text).to.be(String(5 * 1000));
expect(query.scopedVars.__interval_ms.value).to.be(String(5 * 1000));
});
it('should be determined by the 11000 data points limit, accounting for intervalFactor', function() {
var query = {
@@ -590,7 +590,7 @@ describe('PrometheusDatasource', function() {
interval: '5s',
scopedVars: {
__interval: { text: '5s', value: '5s' },
__interval_ms: { text: 5 * 1000, value: 5 * 1000 },
__interval_ms: { text: String(5 * 1000), value: String(5 * 1000) },
},
};
var end = 7 * 24 * 60 * 60;
@@ -609,8 +609,8 @@ describe('PrometheusDatasource', function() {
expect(query.scopedVars.__interval.text).to.be('5s');
expect(query.scopedVars.__interval.value).to.be('5s');
expect(query.scopedVars.__interval_ms.text).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.value).to.be(5 * 1000);
expect(query.scopedVars.__interval_ms.text).to.be(String(5 * 1000));
expect(query.scopedVars.__interval_ms.value).to.be(String(5 * 1000));
});
});
});

View File

@@ -127,4 +127,82 @@ describe('Prometheus Result Transformer', () => {
]);
});
});
describe('When resultFormat is time series', () => {
it('should transform matrix into timeseries', () => {
const response = {
status: 'success',
data: {
resultType: 'matrix',
result: [
{
metric: { __name__: 'test', job: 'testjob' },
values: [[0, '10'], [1, '10'], [2, '0']],
},
],
},
};
let result = [];
let options = {
format: 'timeseries',
start: 0,
end: 2,
};
ctx.resultTransformer.transform(result, { data: response }, options);
expect(result).toEqual([{ target: 'test{job="testjob"}', datapoints: [[10, 0], [10, 1000], [0, 2000]] }]);
});
it('should fill timeseries with null values', () => {
const response = {
status: 'success',
data: {
resultType: 'matrix',
result: [
{
metric: { __name__: 'test', job: 'testjob' },
values: [[1, '10'], [2, '0']],
},
],
},
};
let result = [];
let options = {
format: 'timeseries',
step: 1,
start: 0,
end: 2,
};
ctx.resultTransformer.transform(result, { data: response }, options);
expect(result).toEqual([{ target: 'test{job="testjob"}', datapoints: [[null, 0], [10, 1000], [0, 2000]] }]);
});
it('should align null values with step', () => {
const response = {
status: 'success',
data: {
resultType: 'matrix',
result: [
{
metric: { __name__: 'test', job: 'testjob' },
values: [[4, '10'], [8, '10']],
},
],
},
};
let result = [];
let options = {
format: 'timeseries',
step: 2,
start: 0,
end: 8,
};
ctx.resultTransformer.transform(result, { data: response }, options);
expect(result).toEqual([
{ target: 'test{job="testjob"}', datapoints: [[null, 0], [null, 2000], [10, 4000], [null, 6000], [10, 8000]] },
]);
});
});
});
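The three expectations above all exercise the same gap-filling idea; a standalone sketch (hypothetical `fillSeries`, not the actual result transformer) is:

```typescript
// Walk from start to end in `step` increments, emitting [value, ms]
// datapoints and inserting null wherever the Prometheus matrix has no
// sample at that timestamp.
function fillSeries(
  values: Array<[number, string]>,
  start: number,
  end: number,
  step: number
): Array<[number | null, number]> {
  const byTime: { [t: number]: string } = {};
  for (const [t, v] of values) {
    byTime[t] = v;
  }
  const points: Array<[number | null, number]> = [];
  for (let t = start; t <= end; t += step) {
    const v = byTime[t];
    points.push([v !== undefined ? parseFloat(v) : null, t * 1000]);
  }
  return points;
}

const filled = fillSeries([[4, '10'], [8, '10']], 0, 8, 2);
// [[null, 0], [null, 2000], [10, 4000], [null, 6000], [10, 8000]]
```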

View File

@@ -64,7 +64,8 @@ function graphDirective(timeSrv, popoverSrv, contextSrv) {
}
annotations = ctrl.annotations || [];
buildFlotPairs(data);
updateLegendValues(data, panel);
const graphHeight = elem.height();
updateLegendValues(data, panel, graphHeight);
ctrl.events.emit('render-legend');
});

View File

@@ -25,7 +25,7 @@
display: inline-block;
padding-right: 2px;
&::after {
content: " | ";
content: ' | ';
padding-left: 2px;
}
}
@@ -33,14 +33,23 @@
li:last-child {
&::after {
padding-left: 0;
content: "";
content: '';
}
}
}
.login-page {
.footer {
position: absolute;
bottom: $spacer;
padding: 1rem 0 1rem 0;
}
}
@include media-breakpoint-up(md) {
.login-page {
.footer {
bottom: $spacer;
position: absolute;
padding: 5rem 0 1rem 0;
}
}
}

View File

@@ -16,6 +16,7 @@ div.flot-text {
height: 100%;
&--solo {
margin: 0;
.panel-container {
border: none;
z-index: $zindex-sidemenu + 1;

View File

@@ -1,9 +1,8 @@
$login-border: #8daac5;
.login {
background-position: center;
min-height: 85vh;
height: 80vh;
background-position: center;
background-repeat: no-repeat;
min-width: 100%;
margin-left: 0;
@@ -95,7 +94,7 @@ select:-webkit-autofill:focus {
position: relative;
justify-content: center;
z-index: 1;
height: 320px;
min-height: 320px;
}
.login-branding {
@@ -106,6 +105,7 @@ select:-webkit-autofill:focus {
align-items: center;
justify-content: center;
flex-grow: 0;
padding-top: 2rem;
.logo-icon {
width: 70px;
@@ -127,7 +127,7 @@ select:-webkit-autofill:focus {
.login-inner-box {
text-align: center;
padding: 2rem 4rem;
padding: 2rem;
display: flex;
flex-direction: column;
align-items: center;
@@ -243,7 +243,7 @@ select:-webkit-autofill:focus {
justify-content: space-between;
.login-divider-line {
width: 110px;
width: 100px;
height: 10px;
border-bottom: 1px solid $login-border;
@@ -323,7 +323,10 @@ select:-webkit-autofill:focus {
width: 35%;
padding: 4rem 2rem;
border-right: 1px solid $login-border;
justify-content: flex-start;
.logo-icon {
width: 80px;
}
}
.login-inner-box {
@@ -331,14 +334,18 @@ select:-webkit-autofill:focus {
padding: 1rem 2rem;
}
.login-branding {
.logo-icon {
width: 80px;
.login-divider {
.login-divider-line {
width: 110px;
}
}
}
@include media-breakpoint-up(md) {
.login {
min-height: 100vh;
}
.login-content {
flex: 1 0 100%;
}
@@ -373,10 +380,6 @@ select:-webkit-autofill:focus {
}
@include media-breakpoint-up(lg) {
.login {
min-height: 100vh;
}
.login-form-input {
min-width: 300px;
}

View File

@@ -6,4 +6,7 @@ gpg --allow-secret-key-import --import ~/private-repo/signing/private.key
cp ./scripts/build/rpmmacros ~/.rpmmacros
./scripts/build/sign_expect $GPG_KEY_PASSWORD dist/*.rpm
for package in dist/*.rpm; do
[ -e "$package" ] || continue
./scripts/build/sign_expect $GPG_KEY_PASSWORD $package
done
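The `[ -e "$package" ] || continue` guard added above protects against an unmatched glob: when `dist/*.rpm` matches nothing, the shell passes the pattern through literally, so the loop body would otherwise run once with the string `dist/*.rpm` as `$package`. A minimal demonstration (using a hypothetical `/tmp/globdemo` scratch directory):

```shell
# With no .rpm files present, the glob stays literal and the
# existence guard skips it, leaving $out empty.
mkdir -p /tmp/globdemo/dist
out=""
for package in /tmp/globdemo/dist/*.rpm; do
  [ -e "$package" ] || continue
  out="$out$package"
done
echo "packages:$out"
```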