Commit 99d51436 by Shwetha GS

ATLAS-1009 Source HIVE_HOME and HIVE_CONF_DIR from hive_env.sh (svimal2106 via shwethags)

parent fa9fbbd7
......@@ -31,15 +31,6 @@ done
BASEDIR=`dirname ${PRG}`
BASEDIR=`cd ${BASEDIR}/..;pwd`
-if [ -z "$ATLAS_CONF" ]; then
-ATLAS_CONF=${BASEDIR}/conf
-fi
-export ATLAS_CONF
-if [ -f "${ATLAS_CONF}/atlas-env.sh" ]; then
-. "${ATLAS_CONF}/atlas-env.sh"
-fi
if test -z "${JAVA_HOME}"
then
JAVA_BIN=`which java`
......@@ -55,11 +46,8 @@ if [ ! -e "${JAVA_BIN}" ] || [ ! -e "${JAR_BIN}" ]; then
exit 1
fi
-# Construct classpath using Atlas conf directory
-# and jars from bridge/hive and hook/hive directories.
-ATLASCPPATH="$ATLAS_CONF"
-for i in "${BASEDIR}/hook/hive/"*.jar; do
+# Construct Atlas classpath using jars from hook/hive/atlas-hive-plugin-impl/ directory.
+for i in "${BASEDIR}/hook/hive/atlas-hive-plugin-impl/"*.jar; do
ATLASCPPATH="${ATLASCPPATH}:$i"
done
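The loop above appends every jar in the plugin directory to a colon-separated classpath. As a standalone sketch of the same pattern (the demo directory and jar names here are illustrative, not the script's actual layout):

```shell
#!/bin/sh
# Build a colon-separated classpath from every jar in a directory,
# mirroring the loop in import-hive.sh. JAR_DIR is a throwaway demo dir.
JAR_DIR="$(mktemp -d)"
touch "$JAR_DIR/a.jar" "$JAR_DIR/b.jar"

CLASSPATH=""
for i in "$JAR_DIR"/*.jar; do
  CLASSPATH="${CLASSPATH}:$i"
done
# Drop the leading ':' left by the first append.
CLASSPATH="${CLASSPATH#:}"
echo "$CLASSPATH"
```

Note the script itself never strips the leading `:`; an empty leading classpath entry is harmless to the JVM, but trimming it keeps the string tidy.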
......@@ -72,23 +60,30 @@ TIME=`date +%Y%m%d%H%M%s`
#Add hive conf in classpath
if [ ! -z "$HIVE_CONF_DIR" ]; then
-HIVE_CP=$HIVE_CONF_DIR
+HIVE_CONF=$HIVE_CONF_DIR
elif [ ! -z "$HIVE_HOME" ]; then
-HIVE_CP="$HIVE_HOME/conf"
+HIVE_CONF="$HIVE_HOME/conf"
elif [ -e /etc/hive/conf ]; then
-HIVE_CP="/etc/hive/conf"
+HIVE_CONF="/etc/hive/conf"
else
echo "Could not find a valid HIVE configuration"
exit 1
fi
-echo Using Hive configuration directory ["$HIVE_CP"]
+echo Using Hive configuration directory ["$HIVE_CONF"]
+if [ -f "${HIVE_CONF}/hive-env.sh" ]; then
+. "${HIVE_CONF}/hive-env.sh"
+fi
+if [ -z "$HIVE_HOME" ]; then
+echo "Please set HIVE_HOME to the root of Hive installation"
+exit 1
+fi
+HIVE_CP="${HIVE_CONF}"
+for i in "${HIVE_HOME}/lib/"*.jar; do
+HIVE_CP="${HIVE_CP}:$i"
+done
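The added lines implement a three-way fallback for locating the Hive configuration directory (`HIVE_CONF_DIR`, then `$HIVE_HOME/conf`, then `/etc/hive/conf`), source `hive-env.sh` so `HIVE_HOME` gets populated, and then build the Hive classpath. A minimal runnable sketch of the same resolution order (`DEMO_ETC` stands in for `/etc/hive/conf`, and `/opt/hive` is an illustrative install path):

```shell
#!/bin/sh
# Resolve a Hive conf dir with the same precedence as import-hive.sh:
# HIVE_CONF_DIR, then $HIVE_HOME/conf, then a system default.
# DEMO_ETC is a stand-in for /etc/hive/conf so the sketch runs anywhere.
DEMO_ETC="/tmp/demo-etc-hive-conf"

resolve_hive_conf() {
  if [ -n "$HIVE_CONF_DIR" ]; then
    echo "$HIVE_CONF_DIR"
  elif [ -n "$HIVE_HOME" ]; then
    echo "$HIVE_HOME/conf"
  elif [ -e "$DEMO_ETC" ]; then
    echo "$DEMO_ETC"
  else
    echo "Could not find a valid HIVE configuration" >&2
    return 1
  fi
}

# With only HIVE_HOME set, the second branch wins.
HIVE_CONF_DIR=""
HIVE_HOME="/opt/hive"
HIVE_CONF="$(resolve_hive_conf)"
echo "Using Hive configuration directory [$HIVE_CONF]"
```

Sourcing `hive-env.sh` before the `HIVE_HOME` check is the point of this commit: a distribution that only sets `HIVE_HOME` inside `hive-env.sh` no longer forces the user to export it manually.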
......
......@@ -21,16 +21,11 @@ The entities are created and de-duped using unique qualified name. They provide
---++ Importing Hive Metadata
-org.apache.atlas.hive.bridge.HiveMetaStoreBridge imports the Hive metadata into Atlas using the model defined in org.apache.atlas.hive.model.HiveDataModelGenerator.
-import-hive.sh command can be used to facilitate this.
-The script needs Hadoop and Hive classpath jars.
+org.apache.atlas.hive.bridge.HiveMetaStoreBridge imports the Hive metadata into Atlas using the model defined in org.apache.atlas.hive.model.HiveDataModelGenerator. import-hive.sh command can be used to facilitate this. The script needs Hadoop and Hive classpath jars.
* For Hadoop jars, please make sure that the environment variable HADOOP_CLASSPATH is set. Another way is to set HADOOP_HOME to point to root directory of your Hadoop installation
* Similarly, for Hive jars, set HIVE_HOME to the root of Hive installation
* Set environment variable HIVE_CONF_DIR to Hive configuration directory
* Copy <atlas-conf>/atlas-application.properties to the hive conf directory
<verbatim>
Usage: <atlas package>/bin/import-hive.sh
......
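Per the doc bullets above, a typical environment setup before running the import looks roughly like this (all install paths below are illustrative, not mandated by Atlas):

```shell
# Illustrative paths; substitute your actual install locations.
export HADOOP_CLASSPATH="$(hadoop classpath)"   # or export HADOOP_HOME instead
export HIVE_HOME=/usr/lib/hive
export HIVE_CONF_DIR=/etc/hive/conf
cp /opt/atlas/conf/atlas-application.properties "$HIVE_CONF_DIR/"
/opt/atlas/bin/import-hive.sh
```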
......@@ -6,6 +6,7 @@ INCOMPATIBLE CHANGES:
ALL CHANGES:
+ATLAS-1009 Source HIVE_HOME and HIVE_CONF_DIR from hive_env.sh (svimal2106 via shwethags)
ATLAS-847 UI: Audit versioning does not paginate details from Atlas server (Kalyanikashikar via shwethags)
ATLAS-1004 Option to enable taxonomy feature (kevalbhatt18 via shwethags)
ATLAS-1003 DataSetLineageServiceTest, GraphBackedDiscoveryServiceTest, and GraphRepoMapperScaleTest failing in some environments (dkantor via shwethags)
......