Snowplow (software)

From Wiki @ Karl Jones dot com
Latest revision as of 10:02, 15 September 2016

Snowplow is a marketing and product analytics platform.

Description

According to the official website, Snowplow does three things:

  • Identifies website users, and tracks the way they engage with a website or web application;
  • Stores users' behavioral data in a scalable "event data warehouse" you control: in Amazon S3 and (optionally) Amazon Redshift or Postgres;
  • Leverages a wide range of tools to analyze that behavioral data, from big data tools (e.g. Hive, Pig, Mahout) via EMR to more traditional tools such as Tableau, R, Looker, and Chartio.

Core concepts

Snowplow is built around the following core concepts:

  • Events
  • Dictionaries and schemas
  • Contexts
  • Iglu
  • Stages in the Snowplow data pipeline
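Schemas, contexts, and Iglu come together in Snowplow's self-describing JSON format: each custom event or context is wrapped in an envelope whose "schema" key names the Iglu schema the payload conforms to. A minimal sketch (the vendor and event name here are hypothetical, chosen only for illustration):

```python
import json

# A self-describing JSON: "schema" holds an Iglu schema URI of the form
# iglu:<vendor>/<name>/<format>/<version>, and "data" holds the event
# payload that the named schema describes.
event = {
    "schema": "iglu:com.example/video_play/jsonschema/1-0-0",  # hypothetical vendor/event
    "data": {
        "video_id": "abc123",
        "position_seconds": 42,
    },
}

print(json.dumps(event, indent=2))
```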

Setting up Snowplow

The process of setting up Snowplow consists of four steps:

  1. Setting up a collector;
  2. Setting up a tracker or webhook;
  3. Setting up enrich;
  4. Setting up alternative data stores.
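These components fit together as a linear pipeline: trackers and webhooks send raw events to the collector, enrich processes the collector's output, and the result lands in one or more data stores. A toy sketch of that flow (all function and field names are illustrative, not Snowplow APIs):

```python
def collect(raw_event: dict) -> dict:
    """Collector stage: receive a raw event and stamp it."""
    return {**raw_event, "collector": "example-collector"}

def enrich(event: dict) -> dict:
    """Enrich stage: derive extra fields from the raw payload."""
    enriched = dict(event)
    enriched["geo"] = "unknown"  # an IP -> location lookup would go here
    return enriched

def store(event: dict, warehouse: list) -> None:
    """Storage stage: append to the event store (S3/Redshift in practice)."""
    warehouse.append(event)

warehouse: list = []
store(enrich(collect({"event": "page_view", "page": "/home"})), warehouse)
print(warehouse[0])
```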

Iglu repository

An Iglu repository acts as a store of data schemas for Snowplow (currently JSON Schemas only).

Hosting JSON Schemas in an Iglu repository allows you to use those schemas in Iglu-capable systems such as Snowplow.
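A static Iglu repository is essentially an HTTP-accessible directory tree: a schema URI such as iglu:com.snowplowanalytics.snowplow/screen_view/jsonschema/1-0-0 resolves to the path schemas/<vendor>/<name>/<format>/<version> under the repository root. A small resolver sketch under that assumption (the repository URL is just an example):

```python
def iglu_uri_to_url(uri: str, repo_root: str) -> str:
    """Map an Iglu schema URI to its URL in a static Iglu repository.

    An Iglu URI has the form iglu:<vendor>/<name>/<format>/<version>;
    a static repository serves the schema document at
    <repo_root>/schemas/<vendor>/<name>/<format>/<version>.
    """
    prefix = "iglu:"
    if not uri.startswith(prefix):
        raise ValueError(f"not an Iglu URI: {uri!r}")
    return f"{repo_root.rstrip('/')}/schemas/{uri[len(prefix):]}"

url = iglu_uri_to_url(
    "iglu:com.snowplowanalytics.snowplow/screen_view/jsonschema/1-0-0",
    "http://iglucentral.com",
)
print(url)
```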

Enrich applications

A Snowplow Enrich application processes data from a Snowplow Collector, and stores enriched data in a persistent database.

There are currently two Enrichment processes available for setup:

  • EmrEtlRunner: an application that parses logs from a Collector and stores enriched events to S3;
  • Stream Enrich: a Scala application that reads Thrift events from a Kinesis stream and writes enriched events back to a Kinesis stream.

EmrEtlRunner

Snowplow EmrEtlRunner is an application that parses the log files generated by your Snowplow collector and:

  • Cleans up the data into a format that is easier to parse and analyse;
  • Enriches the data (e.g. infers the visitor's location from their IP address and infers search engine keywords from the query string);
  • Stores that cleaned, enriched data in S3.
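One of the enrichments mentioned above, inferring search engine keywords from the query string, amounts to parsing the referrer URL. A simplified standard-library sketch: the real enrichment knows many engines' parameter names, while this version handles only the Google-style q= parameter:

```python
from typing import Optional
from urllib.parse import urlparse, parse_qs

def search_keywords(referrer_url: str) -> Optional[str]:
    """Extract search keywords from a referrer URL's query string.

    Simplified: only checks the 'q' parameter used by Google-style
    search engines; returns None when no keywords are present.
    """
    query = parse_qs(urlparse(referrer_url).query)
    terms = query.get("q")
    return terms[0] if terms else None

print(search_keywords("https://www.google.com/search?q=snowplow+analytics"))
# -> snowplow analytics
```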

See:

  • emr-etl-runner: https://github.com/snowplow/snowplow/tree/master/3-enrich/emr-etl-runner
  • Setting up EmrEtlRunner: https://github.com/snowplow/snowplow/wiki/setting-up-EmrEtlRunner

Discourse forums

See:

http://discourse.snowplowanalytics.com/users/karl_jones/activity

See also

External links

  • Snowplow: Warning: No tracker configured @ Stack Overflow (code example using a callback): http://stackoverflow.com/questions/37476726/snowplow-warning-no-tracker-configured
  • Specific event tracking with the Javascript tracker v2.5: https://github.com/snowplow/snowplow/wiki/2-Specific-event-tracking-with-the-Javascript-tracker-v2.5#custom-structured-events
  • Advanced usage of the JavaScript Tracker: https://github.com/snowplow/snowplow/wiki/3-Advanced-usage-of-the-JavaScript-Tracker