---++ Importing Hive Metadata
org.apache.atlas.hive.bridge.HiveMetaStoreBridge imports the hive metadata into Atlas using the model defined in org.apache.atlas.hive.model.HiveDataModelGenerator. The import-hive.sh command can be used to facilitate this.
Set-up the following configs in hive-site.xml of your hive set-up and set environment variable HIVE_CONF_DIR to the hive conf directory:
   * Atlas endpoint and cluster name - Add the following properties with the Atlas endpoint and cluster name for your set-up
<verbatim>
<property>
  <name>atlas.rest.address</name>
  <value>http://localhost:21000/</value>
</property>
<property>
  <name>atlas.cluster.name</name>
  <value>primary</value>
</property>
</verbatim>
Usage: <atlas package>/bin/import-hive.sh. The logs are in <atlas package>/logs/import-hive.log
---++ Hive Hook
Hive supports listeners on hive command execution using hive hooks. These are used to add/update/remove entities in Atlas using the model defined in org.apache.atlas.hive.model.HiveDataModelGenerator.
The hook submits the request to a thread pool executor to avoid blocking the command execution. The thread submits the entities as messages to the notification server, and the Atlas server reads these messages and registers the entities.
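The asynchronous hand-off described above can be sketched as follows. This is a minimal illustration of the pattern only; the class AsyncHookSketch, its submitEntity method, and the payload are hypothetical, not the actual Atlas hook API:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the async hand-off: the hook submits entity-registration work
// to a thread pool so the hive command is not blocked while the
// notification is delivered. Names here are illustrative assumptions.
public class AsyncHookSketch {
    private static final ExecutorService executor = Executors.newFixedThreadPool(2);

    // Simulates handing an entity message to the notification server.
    public static Future<String> submitEntity(String entityJson) {
        return executor.submit(() -> {
            // The real hook would publish to a notification topic here;
            // we just tag the payload to show the asynchronous hand-off.
            return "registered:" + entityJson;
        });
    }

    public static void main(String[] args) throws Exception {
        // The caller gets a Future immediately and is free to continue;
        // the result becomes available once the worker thread finishes.
        Future<String> f = submitEntity("{\"table\":\"sales\"}");
        System.out.println(f.get()); // prints: registered:{"table":"sales"}
        executor.shutdown();
    }
}
```

The key design point is that a hook failure or a slow notification server delays only the pool's worker thread, not the user's hive command.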
Follow these instructions in your hive set-up to add the hive hook for Atlas:
   * Set-up the atlas hook in hive-site.xml of your hive configuration:
<verbatim>
<property>
  <name>hive.exec.post.hooks</name>
  <value>org.apache.atlas.hive.hook.HiveHook</value>
</property>
</verbatim>