Anomaly detection on SQL data warehouses and databases

Overview

With CueObserve, you can run anomaly detection on data in your SQL data warehouses and databases.

Getting Started

Install via Docker

docker run -p 3000:80 cuebook/cueobserve

Now visit http://localhost:3000 in your browser.

How it works

You write a SQL GROUP BY query, map its columns as dimensions and measures, and save it as a virtual Dataset.
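
For illustration, a Dataset might pair a GROUP BY query with a column mapping like the sketch below. The table, column names, and mapping keys are hypothetical, not CueObserve's actual configuration format.

```python
# Hypothetical Dataset definition: a GROUP BY query whose result has one
# timestamp column, one or more dimensions, and one or more measures.
dataset_sql = """
SELECT created_date AS "Date",     -- timestamp column
       city         AS "City",     -- dimension
       COUNT(*)     AS "Orders"    -- measure
FROM orders
GROUP BY 1, 2
"""

# The schema map tells CueObserve how to interpret each result column.
schema_map = {
    "timestamp": "Date",
    "dimensions": ["City"],
    "measures": ["Orders"],
}

print(schema_map)
```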

Dataset SQL

Dataset Schema Map

You then define one or more anomaly detection jobs on the dataset.

Anomaly Definition

When an anomaly detection job runs, CueObserve does the following:

  1. Executes the SQL GROUP BY query on your data warehouse and stores the result as a Pandas dataframe.
  2. Generates one or more timeseries from the dataframe, as defined in your anomaly detection job.
  3. Generates a forecast for each timeseries using Prophet.
  4. Creates a visual card for each timeseries. Marks the card as an anomaly if the last data point is anomalous.
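
The steps above can be sketched in miniature as follows. This is a simplified stand-in: plain Python lists instead of a Pandas dataframe, and a mean ± 3σ band in place of Prophet's forecast interval; all data and names are made up.

```python
from statistics import mean, stdev

# Simplified sketch of steps 2-4: split grouped rows into per-dimension
# timeseries, then flag the last point if it falls outside a confidence
# band. CueObserve uses Prophet for the band; mean +/- 3*sigma stands in
# for it here.

rows = [  # (timestamp, dimension value, measure) - step 1's query result
    ("2021-08-01", "NY", 100), ("2021-08-02", "NY", 104),
    ("2021-08-03", "NY", 98),  ("2021-08-04", "NY", 240),
    ("2021-08-01", "SF", 50),  ("2021-08-02", "SF", 52),
    ("2021-08-03", "SF", 49),  ("2021-08-04", "SF", 51),
]

# Step 2: one timeseries per dimension value.
series = {}
for ts, dim, value in rows:
    series.setdefault(dim, []).append(value)

# Steps 3-4: build a band from history and test the last data point.
def is_anomaly(values, band_width=3.0):
    history, last = values[:-1], values[-1]
    m, s = mean(history), stdev(history)
    return abs(last - m) > band_width * s

cards = {dim: is_anomaly(vals) for dim, vals in series.items()}
print(cards)  # NY's jump to 240 is flagged; SF is not
```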

Features

  • Automated SQL to timeseries transformation.
  • Run anomaly detection on the aggregate metric or break it down by any dimension.
  • In-built Scheduler. CueObserve uses Celery as the executor and celery-beat as the scheduler.
  • Slack alerts when anomalies are detected. (coming soon)
  • Monitoring. Slack alert when a job fails. CueObserve maintains detailed logs. (coming soon)
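
For orientation, a celery-beat schedule generally looks like the sketch below. The task path and interval here are hypothetical, not CueObserve's actual task names.

```python
from datetime import timedelta

# Hypothetical celery-beat style schedule. A real entry maps a schedule
# name to a task path and a run interval (or a crontab).
CELERY_BEAT_SCHEDULE = {
    "run-anomaly-detection": {
        "task": "ops.tasks.runAnomalyDetection",  # hypothetical task path
        "schedule": timedelta(hours=1),           # run hourly
        "args": (),
    },
}

print(sorted(CELERY_BEAT_SCHEDULE))
```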

Limitations

  • Currently supports only Prophet for timeseries forecasting.
  • Not built for real-time anomaly detection on streaming data.

Support

For general help using CueObserve, read the documentation, or go to Github Discussions.

To report a bug or request a feature, open an issue.

Contributing

We'd love contributions to CueObserve. Before you contribute, please first discuss the change you wish to make via an issue or a discussion. Contributors are expected to adhere to our code of conduct.

Comments
  • hourly error

    Discussed in https://github.com/cuebook/CueObserve/discussions/75

    Originally posted by jithendra945 August 5, 2021. I'm getting this error when I try to run an anomaly definition.

    I'm not sure why it has None in it.

    bug observe 
    opened by sachinkbansal 8
  • Clickhouse as datasource support

    Describe the solution you'd like Add ClickHouse as a supported data source.

    I will wait until https://github.com/cuebook/CueObserve/issues/52 is resolved and then try to implement a PR.

    enhancement connection 
    opened by Slach 4
  • Support SQL Server Data Source

    Is your feature request related to a problem? Please describe. I'd love to give CueObserve a try but our warehouse is currently in MS SQL Server.

    Describe the solution you'd like Add SQL Server as a supported data source.

    Additional context I'd be interested in making the necessary pull request, but I'd like some high level advice on what might be needed.

    Is it as simple as adding the necessary sqlserver.py in https://github.com/cuebook/CueObserve/tree/main/api/dbConnections ?

    enhancement connection 
    opened by adam133 4
  • Scheduled tasks (6 or more) get stuck indefinitely in the Celery queue

    Describe the bug CueObserve scheduled tasks get stuck in the Celery queue and never complete (no error is thrown).

    To Reproduce Steps to reproduce the behavior:

    1. Create 6 different anomaly definitions
    2. Create a 5-minute cron interval and wait 5 minutes for the scheduled tasks to start

    Those tasks do not complete even after a day.

    For debugging, I tried executing 5 scheduled tasks and 1 scheduled task separately (on different cron intervals); that works fine, but when 6 scheduled tasks share the same cron interval they get stuck and never finish.

    Expected behavior It should complete the 6 accepted tasks and pull the next tasks

    Thanks

    bug observe 
    opened by itsmesrds 3
  • Docker Network Mode and port mapping

    Is your feature request related to a problem? Please describe. The docker-compose files provided set everything to network mode "host". For various reasons this setting does not work in my environment.

    Describe the solution you'd like

    • Ability to use port mapping within the docker-compose files.
    • Ability to use a Docker Swarm.
    observe 
    opened by adrien-ferrara 2
  • docker-compose start error

    got the result:

        /usr/local/lib/python2.7/dist-packages/paramiko/transport.py:33: CryptographyDeprecationWarning: Python 2 is no longer supported by the Python core team. Support for it is now deprecated in cryptography, and will be removed in the next release.
          from cryptography.hazmat.backends import default_backend
        Traceback (most recent call last):
          File "/usr/local/bin/docker-compose", line 7, in <module>
            from compose.cli.main import main
          File "/usr/local/lib/python2.7/dist-packages/compose/cli/main.py", line 24, in <module>
            from ..config import ConfigurationError
          File "/usr/local/lib/python2.7/dist-packages/compose/config/__init__.py", line 6, in <module>
            from .config import ConfigurationError
          File "/usr/local/lib/python2.7/dist-packages/compose/config/config.py", line 51, in <module>
            from .validation import match_named_volumes
          File "/usr/local/lib/python2.7/dist-packages/compose/config/validation.py", line 12, in <module>
            from jsonschema import Draft4Validator
          File "/usr/local/lib/python2.7/dist-packages/jsonschema/__init__.py", line 21, in <module>
            from jsonschema._types import TypeChecker
          File "/usr/local/lib/python2.7/dist-packages/jsonschema/_types.py", line 3, in <module>
            from pyrsistent import pmap
          File "/usr/local/lib/python2.7/dist-packages/pyrsistent/__init__.py", line 3, in <module>
            from pyrsistent._pmap import pmap, m, PMap
          File "/usr/local/lib/python2.7/dist-packages/pyrsistent/_pmap.py", line 98
            ) from e
              ^
        SyntaxError: invalid syntax

    opened by wdongdongde 1
  • Support generic rest end point for notification

    Discussed in https://github.com/cuebook/CueObserve/discussions/129

    Originally posted by pjpringle August 30, 2021. Not everyone has Slack, especially in workplace environments. Provide support for plugging in REST calls to notify of anomalies.

    • [x] Image bytes included in response json
    enhancement observe 
    opened by sachinkbansal 1
  • Failing task due to incomplete data

    Sometimes the anomaly detection job or the RCA job fails due to insufficient data in a few buckets; in that case we don't even show the results for the other buckets.

        if not allTasksSucceeded:
            runStatusObj.status = ANOMALY_DETECTION_ERROR
    

    We have added such checks to the code. But I think this is not a good thing to do, as the user is not aware of the reason for the failure. It would be better to skip the failing bucket and show the results for the rest of the buckets.

    opened by AakarSharmaHME 3
  • Different Priority alerts on different slack channel

    Is your feature request related to a problem? Please describe. There are different kinds of datasets. Anomalies on some are a P0 alert, while on others they are not that urgent. Even within the same dataset, certain thresholds need immediate attention while others do not.

    Describe the solution you'd like While configuring settings for the slack channel we should be able to configure the different priorities and their slack channel id. Then while creating anomaly definition we should mention the priority of the anomaly definition. Just like we do for schedules.

    opened by AakarSharmaHME 5
  • Docker Image Build fails on Dev compose

    Describe the bug Failed to build the Docker image using docker-compose -f docker-compose-dev.yml up -d. The build fails with the error Service 'cueo-backend' failed to build: Build failed.

    To Reproduce Steps to reproduce the behavior:

    1. Git clone this repo
    2. Run docker-compose -f docker-compose-dev.yml up -d

    Expected behavior CueO components image to be built successfully and CueObserve to start running on localhost:3000

    Additional context Will add the entire log in the file or snippet below.

    observe 
    opened by shreyas-dev 4
  • pyarrow not installed

    Discussed in https://github.com/cuebook/CueObserve/discussions/187

    Originally posted by ravi-jam October 16, 2021 Hi Folks,

    My team just deployed CueObserve to EC2. I got this error when we tried to add a dataset:

        {"log":"2021-10-16 06:15:14,232 [dbConnections.bigquery:63] ERROR Can't connect to db with this credentials The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.\n","stream":"stderr","time":"2021-10-16T06:15:14.237031452Z"}

    Any help would be appreciated.

    bug 
    opened by sachinkbansal 0
  • Handle scenario where dataset's last data point is complete and must not be ignored

    Discussed in https://github.com/cuebook/CueObserve/discussions/172

    Originally posted by PhilippeDo October 8, 2021. I now get this error when I try to use Prophet:

        {"d2b9be2a-6d27-4e61-a577-3d99892e94e0": {"dimVal": null, "success": false,
         "error": "single positional indexer is out-of-bounds"}}

        Traceback (most recent call last):
          File "/code/ops/tasks/anomalyDetection.py", line 56, in anomalyService
            result = detect(df, granularity, detectionRuleType, anomalyDef)
          File "/code/ops/tasks/anomalyDetection.py", line 29, in detect
            return prophetDetect(df, granularity)
          File "/code/ops/tasks/detectionTypes/prophet.py", line 47, in prophetDetect
            lastISO = df.iloc[-1]["ds"]
          File "/opt/venv/lib/python3.7/site-packages/pandas/core/indexing.py", line 895, in __getitem__
            return self._getitem_axis(maybe_callable, axis=axis)
          File "/opt/venv/lib/python3.7/site-packages/pandas/core/indexing.py", line 1501, in _getitem_axis
            self._validate_integer(key, axis)
          File "/opt/venv/lib/python3.7/site-packages/pandas/core/indexing.py", line 1444, in _validate_integer
            raise IndexError("single positional indexer is out-of-bounds")
        IndexError: single positional indexer is out-of-bounds

    I used the following in DATASET section

        select DATE_FORMAT(date, '%Y-%m-%d %H') as Date,
               pagelt
        from pageloadtime2

    and then when I run it, the dataset looks like this. It is a minimal dataset without a dimension, with only a measure and a timestamp.

    enhancement anomalyDefinition 
    opened by sachinkbansal 0
Releases (v0.3.2)