---
name: Hive
route: /HookHive
menu: Documentation
submenu: Hooks
---

import themen from 'theme/styles/styled-colors';
import * as theme from 'react-syntax-highlighter/dist/esm/styles/hljs';
import SyntaxHighlighter from 'react-syntax-highlighter';
import Img from 'theme/components/shared/Img';

# Apache Atlas Hook & Bridge for Apache Hive


## Hive Model
The Hive model includes the following types:
   * Entity types:
      * hive_db
         * super-types: Asset
         * attributes: qualifiedName, name, description, owner, clusterName, location, parameters, ownerName
      * hive_table
         * super-types: DataSet
         * attributes: qualifiedName, name, description, owner, db, createTime, lastAccessTime, comment, retention, sd, partitionKeys, columns, aliases, parameters, viewOriginalText, viewExpandedText, tableType, temporary
      * hive_column
         * super-types: DataSet
         * attributes: qualifiedName, name, description, owner, type, comment, table
      * hive_storagedesc
         * super-types: Referenceable
         * attributes: qualifiedName, table, location, inputFormat, outputFormat, compressed, numBuckets, serdeInfo, bucketCols, sortCols, parameters, storedAsSubDirectories
      * hive_process
         * super-types: Process
         * attributes: qualifiedName, name, description, owner, inputs, outputs, startTime, endTime, userName, operationType, queryText, queryPlan, queryId, clusterName
      * hive_column_lineage
         * super-types: Process
         * attributes: qualifiedName, name, description, owner, inputs, outputs, query, depenendencyType, expression
   * Enum types:
      * hive_principal_type
         * values: USER, ROLE, GROUP
   * Struct types:
      * hive_order
         * attributes: col, order
      * hive_serde
         * attributes: name, serializationLib, parameters
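
These type definitions can be inspected on a running Atlas instance through the type-definition REST API. A minimal sketch, assuming Atlas listens at localhost:21000 with the default admin/admin credentials:

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`# fetch the full definition (super-types, attributes) of the hive_table type
curl -s -u admin:admin 'http://localhost:21000/api/atlas/v2/types/typedef/name/hive_table'`}
</SyntaxHighlighter>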

Hive entities are created and de-duped in Atlas using the unique attribute qualifiedName, whose value should be formatted as detailed below. Note that dbName, tableName and columnName should be in lower case.

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`hive_db.qualifiedName:     <dbName>@<clusterName>
hive_table.qualifiedName:  <dbName>.<tableName>@<clusterName>
hive_column.qualifiedName: <dbName>.<tableName>.<columnName>@<clusterName>
hive_process.queryString:  trimmed query string in lower case`}
</SyntaxHighlighter>
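
For example, assuming a cluster named primary, a database sales, a table customers and a column customer_id (hypothetical names), the qualified names would be:

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`hive_db.qualifiedName:     sales@primary
hive_table.qualifiedName:  sales.customers@primary
hive_column.qualifiedName: sales.customers.customer_id@primary`}
</SyntaxHighlighter>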


## Hive Hook
The Atlas Hive hook registers with Hive to listen for create/update/delete operations, and publishes the corresponding metadata changes to Atlas via Kafka notifications.
Follow the instructions below to set up the Atlas hook in Hive:
  * Set up the Atlas hook in hive-site.xml by adding the following:

<SyntaxHighlighter wrapLines={true} language="xml" style={theme.dark}>
{`<property>
    <name>hive.exec.post.hooks</name>
    <value>org.apache.atlas.hive.hook.HiveHook</value>
</property>`}
</SyntaxHighlighter>
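
Note that hive.exec.post.hooks accepts a comma-separated list, so if other post-execution hooks are already configured, the Atlas hook can be appended to them instead of replacing them. A sketch, where the first hook class is purely illustrative:

<SyntaxHighlighter wrapLines={true} language="xml" style={theme.dark}>
{`<property>
    <name>hive.exec.post.hooks</name>
    <value>com.example.ExistingPostHook,org.apache.atlas.hive.hook.HiveHook</value>
</property>`}
</SyntaxHighlighter>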

  * Untar apache-atlas-${project.version}-hive-hook.tar.gz
  * cd apache-atlas-hive-hook-${project.version}
  * Copy the entire contents of folder apache-atlas-hive-hook-${project.version}/hook/hive to `<atlas package>`/hook/hive
  * Add 'export HIVE_AUX_JARS_PATH=`<atlas package>`/hook/hive' in hive-env.sh of your Hive configuration
  * Copy `<atlas-conf>`/atlas-application.properties to the Hive conf directory.
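
Once the hook is in place, one way to verify the setup is to run a Hive DDL statement and look up the resulting entity in Atlas. A minimal sketch, assuming Atlas runs at localhost:21000 with the default admin/admin credentials; the table name is a throw-away example, and since notifications are asynchronous, allow a few seconds before querying:

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`# create a test table; the hook publishes its metadata to Atlas via Kafka
hive -e "create table atlas_hook_smoke_test (id int)"

# search Atlas for the new hive_table entity
curl -s -u admin:admin 'http://localhost:21000/api/atlas/v2/search/basic?typeName=hive_table&query=atlas_hook_smoke_test'`}
</SyntaxHighlighter>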


The following properties in atlas-application.properties control the thread pool and notification details:

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`atlas.hook.hive.synchronous=false # whether to run the hook synchronously. false recommended to avoid delays in Hive query completion. Default: false
atlas.hook.hive.numRetries=3      # number of retries for notification failure. Default: 3
atlas.hook.hive.queueSize=10000   # queue size for the threadpool. Default: 10000
atlas.cluster.name=primary        # clusterName to use in qualifiedName of entities. Default: primary
atlas.kafka.zookeeper.connect=                    # Zookeeper connect URL for Kafka. Example: localhost:2181
atlas.kafka.zookeeper.connection.timeout.ms=30000 # Zookeeper connection timeout. Default: 30000
atlas.kafka.zookeeper.session.timeout.ms=60000    # Zookeeper session timeout. Default: 60000
atlas.kafka.zookeeper.sync.time.ms=20             # Zookeeper sync time. Default: 20`}
</SyntaxHighlighter>

Other configurations for the Kafka notification producer can be specified by prefixing the configuration name with "atlas.kafka.". For the list of configurations supported by the Kafka producer, please refer to [Kafka Producer Configs](http://kafka.apache.org/documentation/#producerconfigs).
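
For example, common Kafka producer settings such as bootstrap.servers, acks and retries can be passed through with this prefix; the values below are illustrative:

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`atlas.kafka.bootstrap.servers=localhost:9092 # Kafka brokers to send notifications to
atlas.kafka.acks=1                           # producer acknowledgement level
atlas.kafka.retries=3                        # producer-level send retries`}
</SyntaxHighlighter>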

## Column Level Lineage

Starting with the 0.8-incubating version of Atlas, column-level lineage is captured for Hive. The details are below.

### Model

* ColumnLineageProcess type is a subtype of Process
* This relates an output Column to a set of input Columns or the input Table
* The lineage also captures the kind of dependency, as listed below:
   * SIMPLE:     output column has the same value as the input
   * EXPRESSION: output column is transformed at runtime by some expression (e.g. a Hive SQL expression) on the input Columns
   * SCRIPT:     output column is transformed by a user-provided script
* In the case of an EXPRESSION dependency, the expression attribute contains the expression in string form (see the example after this list)
* Since Process links input and output DataSets, Column is a subtype of DataSet
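
As an illustrative sketch (table and column names are hypothetical), a CTAS such as the one below would be captured with an EXPRESSION dependency for the output column full_name, with the expression attribute holding the transforming expression in string form:

<SyntaxHighlighter wrapLines={true} language="sql" style={theme.dark}>
{`create table t3 as select concat(first_name, ' ', last_name) as full_name from t1`}
</SyntaxHighlighter>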

### Examples
For a simple CTAS such as the one below:

<SyntaxHighlighter wrapLines={true} language="sql" style={theme.dark}>
{`create table t2 as select id, name from T1`}
</SyntaxHighlighter>

The lineage is captured as:

<Img src={`/images/column_lineage_ex1.png`} height="200" width="400" />


### Extracting Lineage from Hive commands

  * The HiveHook maps the LineageInfo in the HookContext to column lineage instances
  * The LineageInfo in Hive provides column-level lineage for the final FileSinkOperator, linking the output columns to the input columns in the Hive query

## NOTES

   * Column level lineage works with Hive version 1.2.1 after the patch for <a href="https://issues.apache.org/jira/browse/HIVE-13112">HIVE-13112</a> is applied to Hive source
   * Since database names, table names and column names are case insensitive in Hive, the corresponding names in entities are lowercase. So, any search APIs should use lowercase while querying on the entity names (see the example after this list)
   * The following Hive operations are currently captured by the Hive hook:
      * create database
      * create table/view, create table as select
      * load, import, export
      * DMLs (insert)
      * alter database
      * alter table (skewed table information, stored as, protection is not supported)
      * alter view
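
For instance, a table created as SALES.Customers in Hive is stored in Atlas with lowercase names, so lookups should use lowercase values. A minimal sketch using the DSL search API, assuming Atlas at localhost:21000 with the default admin/admin credentials and hypothetical names:

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`# query with lowercase names, even if the table was created as SALES.Customers
curl -s -u admin:admin -G 'http://localhost:21000/api/atlas/v2/search/dsl' --data-urlencode "query=hive_table where qualifiedName='sales.customers@primary'"`}
</SyntaxHighlighter>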


## Importing Hive Metadata
Apache Atlas provides a command-line utility, import-hive.sh, to import metadata of Apache Hive databases and tables into Apache Atlas.
This utility can be used to initialize Apache Atlas with databases/tables present in Apache Hive.
This utility supports importing metadata of a specific table, tables in a specific database, or all databases and tables.

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`Usage 1: <atlas package>/hook-bin/import-hive.sh
Usage 2: <atlas package>/hook-bin/import-hive.sh [-d <database regex> OR --database <database regex>] [-t <table regex> OR --table <table regex>]
Usage 3: <atlas package>/hook-bin/import-hive.sh [-f <filename>]
           File Format:
             database1:tbl1
             database1:tbl2
             database2:tbl1`}
</SyntaxHighlighter>
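
For example, to import a single database, a subset of its tables, or the tables listed in a file (database and table names below are illustrative):

<SyntaxHighlighter wrapLines={true} language="shell" style={theme.dark}>
{`# import all tables of database 'sales'
<atlas package>/hook-bin/import-hive.sh -d sales

# import tables matching 'customer.*' from database 'sales'
<atlas package>/hook-bin/import-hive.sh -d sales -t 'customer.*'

# import the database:table pairs listed in tables.txt
<atlas package>/hook-bin/import-hive.sh -f tables.txt`}
</SyntaxHighlighter>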