dataplatform / atlas · Commits
Commit 5e815267
authored May 20, 2015 by Shwetha GS

Merge branch 'master' into dal

parents 77797892 2f8fc054

Showing 33 changed files with 400 additions and 319 deletions (+400 -319)
InstallationSteps.txt                                                      +2   -2
addons/hive-bridge/pom.xml                                                 +58  -22
addons/hive-bridge/src/bin/import-hive.sh                                  +1   -0
addons/hive-bridge/.../org/apache/hadoop/metadata/hive/bridge/HiveMetaStoreBridge.java  +22  -21
addons/hive-bridge/.../org/apache/hadoop/metadata/hive/hook/HiveHook.java               +1   -1
addons/hive-bridge/.../org/apache/hadoop/metadata/hive/model/HiveDataModelGenerator.java +5  -5
addons/hive-bridge/src/site/twiki/Bridge-Hive.twiki                        +3   -3
addons/hive-bridge/.../org/apache/hadoop/metadata/hive/hook/HiveHookIT.java             +1   -1
addons/hive-bridge/.../org/apache/hadoop/metadata/hive/hook/SSLAndKerberosHiveHookIT.java +1 -1
addons/hive-bridge/.../org/apache/hadoop/metadata/hive/hook/SSLHiveHookIT.java          +1   -1
client/.../org/apache/hadoop/metadata/MetadataServiceClient.java           +24  -18
pom.xml                                                                    +3   -1
.../apache/hadoop/metadata/discovery/HiveLineageService.java               +22  -19
...metadata/discovery/graph/GraphBackedDiscoveryService.java               +1   -1
...adata/repository/graph/GraphBackedMetadataRepository.java               +18  -9
...p/metadata/repository/graph/GraphBackedSearchIndexer.java               +14  -5
.../hadoop/metadata/repository/graph/TitanGraphProvider.java               +5   -0
...p/metadata/repository/typestore/GraphBackedTypeStore.java               +45  -38
...ache/hadoop/metadata/services/DefaultMetadataService.java               +18  -8
.../org/apache/hadoop/metadata/services/MetadataService.java               +2   -1
...p/metadata/discovery/GraphBackedDiscoveryServiceTest.java               +2   -33
...a/repository/graph/GraphBackedMetadataRepositoryTest.java               +35  -3
src/conf/application.properties                                            +4   -3
...g/apache/hadoop/metadata/typesystem/types/TypeSystem.java               +15  -10
...ache/hadoop/metadata/typesystem/types/TypeSystemTest.java               +1   -1
...che/hadoop/metadata/web/listeners/GuiceServletConfig.java               +6   -16
.../apache/hadoop/metadata/web/resources/EntityResource.java               +10  -21
...he/hadoop/metadata/web/resources/HiveLineageResource.java               +6   -9
...g/apache/hadoop/metadata/web/resources/TypesResource.java               +21  -37
.../apache/hadoop/metadata/web/resources/BaseResourceIT.java               +8   -3
...hadoop/metadata/web/resources/EntityJerseyResourceIT.java               +28  -15
...p/metadata/web/resources/HiveLineageJerseyResourceIT.java               +11  -6
.../hadoop/metadata/web/resources/TypesJerseyResourceIT.java               +6   -5
InstallationSteps.txt

@@ -87,11 +87,11 @@ c. Using DGI
   {"Version":"v0.1"}

 * List the types in the repository
-  curl -v http://localhost:21000/api/metadata/types/list
+  curl -v http://localhost:21000/api/metadata/types
   {"list":["biginteger","short","byte","int","string","bigdecimal","boolean","date","double","long","float"],"requestId":"902580786@qtp-1479771328-0"}

 * List the instances for a given type
-  curl -v http://localhost:21000/api/metadata/entities/list/hive_table
+  curl -v http://localhost:21000/api/metadata/entities?type=hive_table
   {"requestId":"788558007@qtp-44808654-5","list":["cb9b5513-c672-42cb-8477-b8f3e537a162","ec985719-a794-4c98-b98f-0509bd23aac0","48998f81-f1d3-45a2-989a-223af5c1ed6e","a54b386e-c759-4651-8779-a099294244c4"]}
   curl -v http://localhost:21000/api/metadata/entities/list/hive_db
...
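The change above moves the REST API from action-style paths (`/types/list`, `/entities/list/<type>`) to resource-style URLs. A minimal sketch of how the new listing URLs compose; the host, port, and the `hive_table` type name are the sample values from the walkthrough above:

```java
import java.net.URI;

public class ListEndpoints {
    public static void main(String[] args) {
        String base = "http://localhost:21000/api/metadata";

        // Types are now listed from the collection resource itself.
        URI types = URI.create(base + "/types");

        // Entity listing takes the type as a query parameter rather than a path segment.
        URI entities = URI.create(base + "/entities?type=" + "hive_table");

        System.out.println(types);    // http://localhost:21000/api/metadata/types
        System.out.println(entities); // http://localhost:21000/api/metadata/entities?type=hive_table
    }
}
```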
addons/hive-bridge/pom.xml

@@ -138,7 +138,7 @@
             <version>2.10</version>
             <executions>
                 <execution>
-                    <id>copy-dependencies</id>
+                    <id>copy-bridge-dependencies</id>
                     <phase>package</phase>
                     <goals>
                         <goal>copy-dependencies</goal>
...
@@ -152,48 +152,84 @@
                     </configuration>
                 </execution>
                 <execution>
-                    <id>copy-hook-dependencies</id>
+                    <id>copy</id>
                     <phase>package</phase>
                     <goals>
-                        <goal>copy-dependencies</goal>
+                        <goal>copy</goal>
                     </goals>
                     <configuration>
-                        <outputDirectory>${project.build.directory}/dependency/hook/hive</outputDirectory>
-                        <includeScope>runtime</includeScope>
-                        <overWriteReleases>false</overWriteReleases>
-                        <overWriteSnapshots>false</overWriteSnapshots>
-                        <overWriteIfNewer>true</overWriteIfNewer>
+                        <artifactItems>
+                            <artifactItem>
+                                <groupId>${project.groupId}</groupId>
+                                <artifactId>${project.artifactId}</artifactId>
+                                <version>${project.version}</version>
+                                <overWrite>true</overWrite>
+                                <outputDirectory>${project.build.directory}/dependency/bridge/hive</outputDirectory>
+                            </artifactItem>
+                        </artifactItems>
                     </configuration>
                 </execution>
+            </executions>
+        </plugin>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-dependency-plugin</artifactId>
+            <version>2.10</version>
+            <executions>
                 <execution>
-                    <id>copy</id>
+                    <id>copy-hook-dependencies</id>
                     <phase>package</phase>
                     <goals>
                         <goal>copy</goal>
                     </goals>
                     <configuration>
+                        <outputDirectory>${project.build.directory}/dependency/hook/hive</outputDirectory>
+                        <overWriteReleases>false</overWriteReleases>
+                        <overWriteSnapshots>false</overWriteSnapshots>
+                        <overWriteIfNewer>true</overWriteIfNewer>
                         <artifactItems>
                             <artifactItem>
                                 <groupId>${project.groupId}</groupId>
                                 <artifactId>${project.artifactId}</artifactId>
                                 <version>${project.version}</version>
-                                <overWrite>true</overWrite>
-                                <outputDirectory>${project.build.directory}/dependency/hook/hive</outputDirectory>
                             </artifactItem>
+                            <artifactItem>
+                                <groupId>org.json4s</groupId>
+                                <artifactId>json4s-native_2.10</artifactId>
+                                <version>${json.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>org.json4s</groupId>
+                                <artifactId>json4s-core_2.10</artifactId>
+                                <version>${json.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>org.json4s</groupId>
+                                <artifactId>json4s-ast_2.10</artifactId>
+                                <version>${json.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>${project.groupId}</groupId>
+                                <artifactId>metadata-client</artifactId>
+                                <version>${project.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>${project.groupId}</groupId>
+                                <artifactId>metadata-typesystem</artifactId>
+                                <version>${project.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>org.scala-lang</groupId>
+                                <artifactId>scala-compiler</artifactId>
+                                <version>${scala.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>org.scala-lang</groupId>
+                                <artifactId>scala-reflect</artifactId>
+                                <version>${scala.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>org.scala-lang</groupId>
+                                <artifactId>scala-library</artifactId>
+                                <version>${scala.version}</version>
+                            </artifactItem>
+                            <artifactItem>
+                                <groupId>org.scala-lang</groupId>
+                                <artifactId>scalap</artifactId>
+                                <version>${scala.version}</version>
+                            </artifactItem>
                         </artifactItems>
                     </configuration>
                 </execution>
...
addons/hive-bridge/src/bin/import-hive.sh

@@ -85,6 +85,7 @@ else
 fi

 export HIVE_CP
 echo Using Hive configuration directory [$HIVE_CP]
+echo "Logs for import are in $METADATA_LOG_DIR/import-hive.log"

 ${JAVA_BIN} ${JAVA_PROPERTIES} -cp ${HIVE_CP}:${METADATACPPATH} org.apache.hadoop.metadata.hive.bridge.HiveMetaStoreBridge
...
addons/hive-bridge/src/main/java/org/apache/hadoop/metadata/hive/bridge/HiveMetaStoreBridge.java

@@ -40,6 +40,7 @@ import org.apache.hadoop.metadata.typesystem.json.Serialization;
 import org.apache.hadoop.metadata.typesystem.persistence.Id;
 import org.apache.hadoop.metadata.typesystem.types.TypeSystem;
 import org.codehaus.jettison.json.JSONArray;
+import org.codehaus.jettison.json.JSONException;
 import org.codehaus.jettison.json.JSONObject;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
...
@@ -119,8 +120,8 @@ public class HiveMetaStoreBridge {
         if (results.length() == 0) {
             return null;
         } else {
-            ITypedReferenceableInstance reference = Serialization.fromJson(results.get(0).toString());
-            return new Referenceable(reference.getId().id, typeName, null);
+            String guid = getGuidFromDSLResponse(results.getJSONObject(0));
+            return new Referenceable(guid, typeName, null);
         }
     }
...
@@ -192,19 +193,22 @@ public class HiveMetaStoreBridge {
         //todo DSL support for reference doesn't work. is the usage right?
         // String query = String.format("%s where dbName = \"%s\" and tableName = \"%s\"", typeName, dbRef.getId().id,
         // tableName);
-        String query = String.format("%s where tableName = \"%s\"", typeName, tableName);
+        String query = String.format("%s where name = \"%s\"", typeName, tableName);
         JSONArray results = dgiClient.searchByDSL(query);
         if (results.length() == 0) {
             return null;
         } else {
             //There should be just one instance with the given name
-            ITypedReferenceableInstance reference = Serialization.fromJson(results.get(0).toString());
-            String guid = reference.getId().id;
+            String guid = getGuidFromDSLResponse(results.getJSONObject(0));
             LOG.debug("Got reference for table {}.{} = {}", dbRef, tableName, guid);
             return new Referenceable(guid, typeName, null);
         }
     }

+    private String getGuidFromDSLResponse(JSONObject jsonObject) throws JSONException {
+        return jsonObject.getJSONObject("$id$").getString("id");
+    }
+
     private Referenceable getSDForTable(Referenceable dbRef, String tableName) throws Exception {
         Referenceable tableRef = getTableReference(dbRef, tableName);
         if (tableRef == null) {
...
@@ -212,7 +216,7 @@ public class HiveMetaStoreBridge {
         }
         MetadataServiceClient dgiClient = getMetadataServiceClient();
-        ITypedReferenceableInstance tableInstance = dgiClient.getEntity(tableRef.getId().id);
+        Referenceable tableInstance = dgiClient.getEntity(tableRef.getId().id);
         Id sdId = (Id) tableInstance.get("sd");
         return new Referenceable(sdId.id, sdId.getTypeName(), null);
     }
...
@@ -223,6 +227,7 @@ public class HiveMetaStoreBridge {
     }

     public Referenceable registerTable(Referenceable dbReference, String dbName, String tableName) throws Exception {
+        LOG.info("Attempting to register table [" + tableName + "]");
         Referenceable tableRef = getTableReference(dbReference, tableName);
         if (tableRef == null) {
             LOG.info("Importing objects from " + dbName + "." + tableName);
...
@@ -230,7 +235,7 @@ public class HiveMetaStoreBridge {
             Table hiveTable = hiveClient.getTable(dbName, tableName);
             tableRef = new Referenceable(HiveDataTypes.HIVE_TABLE.getName());
-            tableRef.set("tableName", hiveTable.getTableName());
+            tableRef.set("name", hiveTable.getTableName());
             tableRef.set("owner", hiveTable.getOwner());
             //todo fix
             tableRef.set("createTime", hiveTable.getLastAccessTime());
...
@@ -274,8 +279,8 @@ public class HiveMetaStoreBridge {
             tableRef.set("tableType", hiveTable.getTableType());
             tableRef.set("temporary", hiveTable.isTemporary());
-            // List<Referenceable> fieldsList = getColumns(storageDesc);
-            // tableRef.set("columns", fieldsList);
+            List<Referenceable> colList = getColumns(hiveTable.getAllCols());
+            tableRef.set("columns", colList);
             tableRef = createInstance(tableRef);
         } else {
...
@@ -397,7 +402,7 @@ public class HiveMetaStoreBridge {
         }
         */
-        List<Referenceable> fieldsList = getColumns(storageDesc);
+        List<Referenceable> fieldsList = getColumns(storageDesc.getCols());
         sdReferenceable.set("cols", fieldsList);

         List<Struct> sortColsStruct = new ArrayList<>();
...
@@ -428,19 +433,19 @@ public class HiveMetaStoreBridge {
         return createInstance(sdReferenceable);
     }

-    private List<Referenceable> getColumns(StorageDescriptor storageDesc) throws Exception {
-        List<Referenceable> fieldsList = new ArrayList<>();
-        Referenceable colReferenceable;
-        for (FieldSchema fs : storageDesc.getCols()) {
+    private List<Referenceable> getColumns(List<FieldSchema> schemaList) throws Exception {
+        List<Referenceable> colList = new ArrayList<>();
+        for (FieldSchema fs : schemaList) {
             LOG.debug("Processing field " + fs);
-            colReferenceable = new Referenceable(HiveDataTypes.HIVE_COLUMN.getName());
+            Referenceable colReferenceable = new Referenceable(HiveDataTypes.HIVE_COLUMN.getName());
             colReferenceable.set("name", fs.getName());
             colReferenceable.set("type", fs.getType());
             colReferenceable.set("comment", fs.getComment());
-            fieldsList.add(createInstance(colReferenceable));
+            colList.add(createInstance(colReferenceable));
         }
-        return fieldsList;
+        return colList;
     }

     public synchronized void registerHiveDataModel() throws Exception {
...
@@ -454,10 +459,6 @@ public class HiveMetaStoreBridge {
         } else {
             LOG.info("Hive data model is already registered!");
         }
-
-        //todo remove when fromJson(entityJson) is supported on client
-        dataModelGenerator.createDataModel();
-        TypeSystem.getInstance().defineTypes(dataModelGenerator.getTypesDef());
     }

     public static void main(String[] argv) throws Exception {
...
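The new getGuidFromDSLResponse helper reads the instance id nested under the "$id$" key of a DSL search row. A runnable sketch of that extraction, with a plain Map standing in for the jettison JSONObject so the snippet is self-contained; the guid value is just an example:

```java
import java.util.Map;

public class DslGuid {
    // Mirrors getGuidFromDSLResponse: a DSL result row nests the
    // instance id under "$id$".id.
    static String getGuid(Map<String, Map<String, String>> row) {
        return row.get("$id$").get("id");
    }

    public static void main(String[] args) {
        Map<String, Map<String, String>> row =
                Map.of("$id$", Map.of("id", "cb9b5513-c672-42cb-8477-b8f3e537a162"));
        System.out.println(getGuid(row)); // cb9b5513-c672-42cb-8477-b8f3e537a162
    }
}
```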
addons/hive-bridge/src/main/java/org/apache/hadoop/metadata/hive/hook/HiveHook.java

@@ -246,7 +246,7 @@ public class HiveHook implements ExecuteWithHookContext, HiveSemanticAnalyzerHoo
         LOG.debug("Registering CTAS query: {}", queryStr);
         Referenceable processReferenceable = new Referenceable(HiveDataTypes.HIVE_PROCESS.getName());
-        processReferenceable.set("processName", operation.getOperationName());
+        processReferenceable.set("name", operation.getOperationName());
         processReferenceable.set("startTime", queryStartTime);
         processReferenceable.set("userName", user);
         List<Referenceable> source = new ArrayList<>();
...
addons/hive-bridge/src/main/java/org/apache/hadoop/metadata/hive/model/HiveDataModelGenerator.java

@@ -367,7 +367,7 @@ public class HiveDataModelGenerator {
     private void createTableClass() throws MetadataException {
         AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
-                new AttributeDefinition("tableName", DataTypes.STRING_TYPE.getName(),
+                new AttributeDefinition("name", DataTypes.STRING_TYPE.getName(),
                         Multiplicity.REQUIRED, false, null),
                 new AttributeDefinition("dbName", HiveDataTypes.HIVE_DB.getName(),
                         Multiplicity.REQUIRED, false, null),
...
@@ -384,9 +384,9 @@ public class HiveDataModelGenerator {
                 new AttributeDefinition("partitionKeys",
                         DataTypes.arrayTypeName(HiveDataTypes.HIVE_COLUMN.getName()),
                         Multiplicity.OPTIONAL, false, null),
-                // new AttributeDefinition("columns",
-                //         DataTypes.arrayTypeName(HiveDataTypes.HIVE_COLUMN.getName()),
-                //         Multiplicity.COLLECTION, true, null),
+                new AttributeDefinition("columns",
+                        DataTypes.arrayTypeName(HiveDataTypes.HIVE_COLUMN.getName()),
+                        Multiplicity.OPTIONAL, true, null),
                 new AttributeDefinition("parameters", STRING_MAP_TYPE.getName(),
                         Multiplicity.OPTIONAL, false, null),
                 new AttributeDefinition("viewOriginalText", DataTypes.STRING_TYPE.getName(),
...
@@ -480,7 +480,7 @@ public class HiveDataModelGenerator {
     private void createProcessClass() throws MetadataException {
         AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
-                new AttributeDefinition("processName", DataTypes.STRING_TYPE.getName(),
+                new AttributeDefinition("name", DataTypes.STRING_TYPE.getName(),
                         Multiplicity.REQUIRED, false, null),
                 new AttributeDefinition("startTime", DataTypes.INT_TYPE.getName(),
                         Multiplicity.REQUIRED, false, null),
...
addons/hive-bridge/src/site/twiki/Bridge-Hive.twiki

@@ -7,7 +7,7 @@ Hive metadata can be modelled in DGI using its Type System. The default modellin
    * hive_order(StructType) - [col, order]
    * hive_resourceuri(StructType) - [resourceType, uri]
    * hive_serde(StructType) - [name, serializationLib, parameters]
-   * hive_process(ClassType) - [processName, startTime, endTime, userName, sourceTableNames, targetTableNames, queryText, queryPlan, queryId, queryGraph]
+   * hive_process(ClassType) - [name, startTime, endTime, userName, sourceTableNames, targetTableNames, queryText, queryPlan, queryId, queryGraph]
    * hive_function(ClassType) - [functionName, dbName, className, ownerName, ownerType, createTime, functionType, resourceUris]
    * hive_type(ClassType) - [name, type1, type2, fields]
    * hive_partition(ClassType) - [values, dbName, tableName, createTime, lastAccessTime, sd, parameters]
...
@@ -16,7 +16,7 @@ Hive metadata can be modelled in DGI using its Type System. The default modellin
    * hive_role(ClassType) - [roleName, createTime, ownerName]
    * hive_column(ClassType) - [name, type, comment]
    * hive_db(ClassType) - [name, description, locationUri, parameters, ownerName, ownerType]
-   * hive_table(ClassType) - [tableName, dbName, owner, createTime, lastAccessTime, retention, sd, partitionKeys, parameters, viewOriginalText, viewExpandedText, tableType, temporary]
+   * hive_table(ClassType) - [name, dbName, owner, createTime, lastAccessTime, retention, sd, partitionKeys, columns, parameters, viewOriginalText, viewExpandedText, tableType, temporary]

---++ Importing Hive Metadata
...
@@ -31,7 +31,7 @@ hive conf directory:
 </property>
 </verbatim>
-Usage: <dgi package>/bin/import-hive.sh
+Usage: <dgi package>/bin/import-hive.sh. The logs are in <dgi package>/logs/import-hive.log

---++ Hive Hook
...
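With the hive_table and hive_process attributes renamed from tableName/processName to name, DSL lookups are built against the "name" attribute (as in the bridge change in this commit). A small sketch of the query string composition; "sales_fact" is a made-up table name for illustration:

```java
public class TableLookupQuery {
    public static void main(String[] args) {
        String typeName = "hive_table";
        String tableName = "sales_fact"; // example value

        // Tables are now matched on "name" rather than "tableName".
        String query = String.format("%s where name = \"%s\"", typeName, tableName);
        System.out.println(query); // hive_table where name = "sales_fact"
    }
}
```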
addons/hive-bridge/src/test/java/org/apache/hadoop/metadata/hive/hook/HiveHookIT.java

@@ -106,7 +106,7 @@ public class HiveHookIT {
     }

     private void assertTableIsRegistered(String tableName) throws Exception {
-        assertInstanceIsRegistered(HiveDataTypes.HIVE_TABLE.getName(), "tableName", tableName);
+        assertInstanceIsRegistered(HiveDataTypes.HIVE_TABLE.getName(), "name", tableName);
     }

     private void assertDatabaseIsRegistered(String dbName) throws Exception {
...
addons/hive-bridge/src/test/java/org/apache/hadoop/metadata/hive/hook/SSLAndKerberosHiveHookIT.java

@@ -236,7 +236,7 @@ public class SSLAndKerberosHiveHookIT extends BaseSSLAndKerberosTest {
     }

     private void assertTableIsRegistered(String tableName) throws Exception {
-        assertInstanceIsRegistered(HiveDataTypes.HIVE_TABLE.getName(), "tableName", tableName);
+        assertInstanceIsRegistered(HiveDataTypes.HIVE_TABLE.getName(), "name", tableName);
     }

     private void assertDatabaseIsRegistered(String dbName) throws Exception {
...
addons/hive-bridge/src/test/java/org/apache/hadoop/metadata/hive/hook/SSLHiveHookIT.java

@@ -239,7 +239,7 @@ public class SSLHiveHookIT {
     }

     private void assertTableIsRegistered(String tableName) throws Exception {
-        assertInstanceIsRegistered(HiveDataTypes.HIVE_TABLE.getName(), "tableName", tableName);
+        assertInstanceIsRegistered(HiveDataTypes.HIVE_TABLE.getName(), "name", tableName);
     }

     private void assertDatabaseIsRegistered(String dbName) throws Exception {
...
client/src/main/java/org/apache/hadoop/metadata/MetadataServiceClient.java
View file @
5e815267
...
@@ -26,6 +26,8 @@ import com.sun.jersey.client.urlconnection.URLConnectionClientHandler;
...
@@ -26,6 +26,8 @@ import com.sun.jersey.client.urlconnection.URLConnectionClientHandler;
import
org.apache.commons.configuration.PropertiesConfiguration
;
import
org.apache.commons.configuration.PropertiesConfiguration
;
import
org.apache.hadoop.metadata.security.SecureClientUtils
;
import
org.apache.hadoop.metadata.security.SecureClientUtils
;
import
org.apache.hadoop.metadata.typesystem.ITypedReferenceableInstance
;
import
org.apache.hadoop.metadata.typesystem.ITypedReferenceableInstance
;
import
org.apache.hadoop.metadata.typesystem.Referenceable
;
import
org.apache.hadoop.metadata.typesystem.json.InstanceSerialization
;
import
org.apache.hadoop.metadata.typesystem.json.Serialization
;
import
org.apache.hadoop.metadata.typesystem.json.Serialization
;
import
org.codehaus.jettison.json.JSONArray
;
import
org.codehaus.jettison.json.JSONArray
;
import
org.codehaus.jettison.json.JSONException
;
import
org.codehaus.jettison.json.JSONException
;
...
@@ -50,7 +52,11 @@ public class MetadataServiceClient {
...
@@ -50,7 +52,11 @@ public class MetadataServiceClient {
public
static
final
String
REQUEST_ID
=
"requestId"
;
public
static
final
String
REQUEST_ID
=
"requestId"
;
public
static
final
String
RESULTS
=
"results"
;
public
static
final
String
RESULTS
=
"results"
;
public
static
final
String
TOTAL_SIZE
=
"totalSize"
;
public
static
final
String
TOTAL_SIZE
=
"totalSize"
;
private
static
final
String
BASE_URI
=
"api/metadata/"
;
private
static
final
String
URI_TYPES
=
"types"
;
private
static
final
String
URI_ENTITIES
=
"entities"
;
private
static
final
String
URI_TRAITS
=
"traits"
;
private
static
final
String
URI_SEARCH
=
"discovery/search"
;
private
WebResource
service
;
private
WebResource
service
;
...
@@ -81,27 +87,27 @@ public class MetadataServiceClient {
...
@@ -81,27 +87,27 @@ public class MetadataServiceClient {
     static enum API {
         //Type operations
-        CREATE_TYPE("api/metadata/types/submit", HttpMethod.POST),
-        GET_TYPE("api/metadata/types/definition", HttpMethod.GET),
-        LIST_TYPES("api/metadata/types/list", HttpMethod.GET),
-        LIST_TRAIT_TYPES("api/metadata/types/traits/list", HttpMethod.GET),
+        CREATE_TYPE(BASE_URI + URI_TYPES, HttpMethod.POST),
+        GET_TYPE(BASE_URI + URI_TYPES, HttpMethod.GET),
+        LIST_TYPES(BASE_URI + URI_TYPES, HttpMethod.GET),
+        LIST_TRAIT_TYPES(BASE_URI + URI_TYPES + "?type=trait", HttpMethod.GET),

         //Entity operations
-        CREATE_ENTITY("api/metadata/entities/submit", HttpMethod.POST),
-        GET_ENTITY("api/metadata/entities/definition", HttpMethod.GET),
-        UPDATE_ENTITY("api/metadata/entities/update", HttpMethod.PUT),
-        LIST_ENTITY("api/metadata/entities/list", HttpMethod.GET),
+        CREATE_ENTITY(BASE_URI + URI_ENTITIES, HttpMethod.POST),
+        GET_ENTITY(BASE_URI + URI_ENTITIES, HttpMethod.GET),
+        UPDATE_ENTITY(BASE_URI + URI_ENTITIES, HttpMethod.PUT),
+        LIST_ENTITY(BASE_URI + URI_ENTITIES + "?type=", HttpMethod.GET),

         //Trait operations
-        ADD_TRAITS("api/metadata/traits/add", HttpMethod.POST),
-        DELETE_TRAITS("api/metadata/traits/delete", HttpMethod.PUT),
-        LIST_TRAITS("api/metadata/traits/list", HttpMethod.GET),
+        ADD_TRAITS(BASE_URI + URI_TRAITS, HttpMethod.POST),
+        DELETE_TRAITS(BASE_URI + URI_TRAITS, HttpMethod.DELETE),
+        LIST_TRAITS(BASE_URI + URI_TRAITS, HttpMethod.GET),

         //Search operations
-        SEARCH("api/metadata/discovery/search", HttpMethod.GET),
-        SEARCH_DSL("api/metadata/discovery/search/dsl", HttpMethod.GET),
-        SEARCH_GREMLIN("api/metadata/discovery/search/gremlin", HttpMethod.GET),
-        SEARCH_FULL_TEXT("api/metadata/discovery/search/fulltext", HttpMethod.GET);
+        SEARCH(BASE_URI + URI_SEARCH, HttpMethod.GET),
+        SEARCH_DSL(BASE_URI + URI_SEARCH + "/dsl", HttpMethod.GET),
+        SEARCH_GREMLIN(BASE_URI + URI_SEARCH + "/gremlin", HttpMethod.GET),
+        SEARCH_FULL_TEXT(BASE_URI + URI_SEARCH + "/fulltext", HttpMethod.GET);

         private final String method;
         private final String path;
...
@@ -176,11 +182,11 @@ public class MetadataServiceClient {
      * @return result json object
      * @throws MetadataServiceException
      */
-    public ITypedReferenceableInstance getEntity(String guid) throws MetadataServiceException {
+    public Referenceable getEntity(String guid) throws MetadataServiceException {
         JSONObject jsonResponse = callAPI(API.GET_ENTITY, null, guid);
         try {
             String entityInstanceDefinition = jsonResponse.getString(MetadataServiceClient.RESULTS);
-            return Serialization.fromJson(entityInstanceDefinition);
+            return InstanceSerialization.fromJsonReferenceable(entityInstanceDefinition, true);
         } catch (JSONException e) {
             throw new MetadataServiceException(e);
         }
     }
...
...
pom.xml
View file @ 5e815267
...
@@ -86,6 +86,7 @@
     <scala.version>2.10.4</scala.version>
     <scala.binary.version>2.10</scala.binary.version>
     <scala.macros.version>2.0.1</scala.macros.version>
+    <json.version>3.2.11</json.version>
     <log4j.version>1.2.17</log4j.version>
     <akka.version>2.3.7</akka.version>
     <spray.version>1.3.1</spray.version>
...
@@ -506,7 +507,7 @@
         <dependency>
             <groupId>org.json4s</groupId>
             <artifactId>json4s-native_2.10</artifactId>
-            <version>3.2.11</version>
+            <version>${json.version}</version>
         </dependency>
         <dependency>
...
@@ -945,6 +946,7 @@
                         <exclude>**/maven-eclipse.xml</exclude>
                         <exclude>**/.externalToolBuilders/**</exclude>
                         <exclude>dashboard/**</exclude>
+                        <exclude>**/build.log</exclude>
                     </excludes>
                 </configuration>
                 <executions>
...
repository/src/main/java/org/apache/hadoop/metadata/discovery/HiveLineageService.java
View file @ 5e815267
...
@@ -100,17 +100,18 @@ public class HiveLineageService implements LineageService {
     public String getOutputs(String tableName) throws DiscoveryException {
         LOG.info("Fetching lineage outputs for tableName={}", tableName);

-        try {
-            HiveWhereUsedQuery outputsQuery = new HiveWhereUsedQuery(
-                    HIVE_TABLE_TYPE_NAME, tableName, HIVE_PROCESS_TYPE_NAME,
-                    HIVE_PROCESS_INPUT_ATTRIBUTE_NAME, HIVE_PROCESS_OUTPUT_ATTRIBUTE_NAME,
-                    Option.empty(), SELECT_ATTRIBUTES, true,
-                    graphPersistenceStrategy, titanGraph);
-            Expressions.Expression expression = outputsQuery.expr();
+        HiveWhereUsedQuery outputsQuery = new HiveWhereUsedQuery(
+                HIVE_TABLE_TYPE_NAME, tableName, HIVE_PROCESS_TYPE_NAME,
+                HIVE_PROCESS_INPUT_ATTRIBUTE_NAME, HIVE_PROCESS_OUTPUT_ATTRIBUTE_NAME,
+                Option.empty(), SELECT_ATTRIBUTES, true,
+                graphPersistenceStrategy, titanGraph);
+        Expressions.Expression expression = outputsQuery.expr();
+        LOG.debug("Expression is [" + expression.toString() + "]");
+
+        try {
             return discoveryService.evaluate(expression).toJson();
         } catch (Exception e) { // unable to catch ExpressionException
-            throw new DiscoveryException("Invalid expression", e);
+            throw new DiscoveryException("Invalid expression [" + expression.toString() + "]", e);
         }
     }
...
@@ -124,17 +125,18 @@ public class HiveLineageService implements LineageService {
     public String getInputs(String tableName) throws DiscoveryException {
         LOG.info("Fetching lineage inputs for tableName={}", tableName);

-        try {
-            HiveLineageQuery inputsQuery = new HiveLineageQuery(
-                    HIVE_TABLE_TYPE_NAME, tableName, HIVE_PROCESS_TYPE_NAME,
-                    HIVE_PROCESS_INPUT_ATTRIBUTE_NAME, HIVE_PROCESS_OUTPUT_ATTRIBUTE_NAME,
-                    Option.empty(), SELECT_ATTRIBUTES, true,
-                    graphPersistenceStrategy, titanGraph);
-            Expressions.Expression expression = inputsQuery.expr();
+        HiveLineageQuery inputsQuery = new HiveLineageQuery(
+                HIVE_TABLE_TYPE_NAME, tableName, HIVE_PROCESS_TYPE_NAME,
+                HIVE_PROCESS_INPUT_ATTRIBUTE_NAME, HIVE_PROCESS_OUTPUT_ATTRIBUTE_NAME,
+                Option.empty(), SELECT_ATTRIBUTES, true,
+                graphPersistenceStrategy, titanGraph);
+        Expressions.Expression expression = inputsQuery.expr();
+        LOG.debug("Expression is [" + expression.toString() + "]");
+
+        try {
             return discoveryService.evaluate(expression).toJson();
         } catch (Exception e) { // unable to catch ExpressionException
-            throw new DiscoveryException("Invalid expression", e);
+            throw new DiscoveryException("Invalid expression [" + expression.toString() + "]", e);
         }
     }
...
@@ -148,9 +150,10 @@ public class HiveLineageService implements LineageService {
     public String getSchema(String tableName) throws DiscoveryException {
         // todo - validate if indeed this is a table type and exists
         String schemaQuery = HIVE_TABLE_TYPE_NAME
-                + " where name=\"" + tableName + "\", "
-                + HIVE_TABLE_COLUMNS_ATTRIBUTE_NAME;
-                // + " as column select column.name, column.dataType, column.comment";
+                + " where name=\"" + tableName + "\""
+                + ", " + HIVE_TABLE_COLUMNS_ATTRIBUTE_NAME
+                // + " as column select column.name, column.dataType, column.comment"
+                ;
         return discoveryService.searchByDSL(schemaQuery);
     }
 }
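The lineage hunks above hoist the query construction out of the `try` block so the `expression` variable stays in scope for the `catch` clause, letting the rethrown `DiscoveryException` name the failing expression. A self-contained sketch of the same scoping idea, using a hypothetical `parse` helper in place of `outputsQuery.expr()`:

```java
public class ErrorContextSketch {
    // Hypothetical parser standing in for expression evaluation; throws on bad input.
    static int parse(String expr) {
        return Integer.parseInt(expr);
    }

    // Declaring `normalized` before the try block keeps it visible to the catch
    // clause, so the wrapped exception can include the offending input -- the
    // same motivation as the HiveLineageService change.
    public static int evaluate(String expr) {
        String normalized = expr.trim();  // in scope for both try and catch
        try {
            return parse(normalized);
        } catch (NumberFormatException e) {
            throw new IllegalStateException("Invalid expression [" + normalized + "]", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(evaluate(" 42 "));
    }
}
```

Had `normalized` been declared inside the `try`, the error message could only say "invalid expression" without saying which one.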
repository/src/main/java/org/apache/hadoop/metadata/discovery/graph/GraphBackedDiscoveryService.java
View file @ 5e815267
...
@@ -73,7 +73,7 @@ public class GraphBackedDiscoveryService implements DiscoveryService {
         this.graphPersistenceStrategy = new DefaultGraphPersistenceStrategy(metadataRepository);
     }

-    //Refer http://s3.thinkaurelius.com/docs/titan/0.5.0/index-backends.html for indexed query
+    //Refer http://s3.thinkaurelius.com/docs/titan/0.5.4/index-backends.html for indexed query
     //http://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query
     // .html#query-string-syntax for query syntax
     @Override
...
repository/src/main/java/org/apache/hadoop/metadata/repository/graph/GraphBackedMetadataRepository.java
View file @ 5e815267
...
@@ -128,7 +128,16 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
     @Override
     public String getEdgeLabel(IDataType<?> dataType, AttributeInfo aInfo) {
-        return EDGE_LABEL_PREFIX + dataType.getName() + "." + aInfo.name;
+        return getEdgeLabel(dataType.getName(), aInfo.name);
+    }
+
+    public String getEdgeLabel(String typeName, String attrName) {
+        return EDGE_LABEL_PREFIX + typeName + "." + attrName;
+    }
+
+    public String getEdgeLabel(ITypedInstance typedInstance, AttributeInfo aInfo) throws MetadataException {
+        IDataType dataType = typeSystem.getDataType(IDataType.class, typedInstance.getTypeName());
+        return getEdgeLabel(dataType, aInfo);
     }

     @Override
...
@@ -275,7 +284,7 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
         }

         final String entityTypeName = getTypeName(instanceVertex);
-        String relationshipLabel = entityTypeName + "." + traitNameToBeDeleted;
+        String relationshipLabel = getEdgeLabel(entityTypeName, traitNameToBeDeleted);
         Iterator<Edge> results = instanceVertex.getEdges(
                 Direction.OUT, relationshipLabel).iterator();
         if (results.hasNext()) { // there should only be one edge for this label
...
@@ -673,6 +682,7 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
             Object attrValue = typedInstance.get(attributeInfo.name);
             LOG.debug("mapping attribute {} = {}", attributeInfo.name, attrValue);
             final String propertyName = getQualifiedName(typedInstance, attributeInfo);
+            String edgeLabel = getEdgeLabel(typedInstance, attributeInfo);
             if (attrValue == null) {
                 return;
             }
...
@@ -698,11 +708,10 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
             case STRUCT:
                 Vertex structInstanceVertex = mapStructInstanceToVertex(id,
                         (ITypedStruct) typedInstance.get(attributeInfo.name),
                         attributeInfo, idToVertexMap);
                 // add an edge to the newly created vertex from the parent
                 GraphHelper.addEdge(
-                        titanGraph, instanceVertex, structInstanceVertex, propertyName);
+                        titanGraph, instanceVertex, structInstanceVertex, edgeLabel);
                 break;

             case TRAIT:
...
@@ -712,7 +721,7 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
             case CLASS:
                 Id referenceId = (Id) typedInstance.get(attributeInfo.name);
                 mapClassReferenceAsEdge(
-                        instanceVertex, idToVertexMap, propertyName, referenceId);
+                        instanceVertex, idToVertexMap, edgeLabel, referenceId);
                 break;

             default:
...
@@ -886,7 +895,7 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
                     traitInstance.fieldMapping().fields, idToVertexMap);

             // add an edge to the newly created vertex from the parent
-            String relationshipLabel = typedInstanceTypeName + "." + traitName;
+            String relationshipLabel = getEdgeLabel(typedInstanceTypeName, traitName);
             GraphHelper.addEdge(
                     titanGraph, parentInstanceVertex, traitInstanceVertex, relationshipLabel);
         }
...
@@ -1017,7 +1026,7 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
                     break;

                 case CLASS:
-                    String relationshipLabel = getQualifiedName(typedInstance, attributeInfo);
+                    String relationshipLabel = getEdgeLabel(typedInstance, attributeInfo);
                     Object idOrInstance = mapClassReferenceToVertex(instanceVertex,
                             attributeInfo, relationshipLabel, attributeInfo.dataType());
                     typedInstance.set(attributeInfo.name, idOrInstance);
...
@@ -1221,7 +1230,7 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
                 ITypedStruct structInstance = structType.createInstance();
                 typedInstance.set(attributeInfo.name, structInstance);

-                String relationshipLabel = getQualifiedName(typedInstance, attributeInfo);
+                String relationshipLabel = getEdgeLabel(typedInstance, attributeInfo);
                 LOG.debug("Finding edge for {} -> label {} ", instanceVertex, relationshipLabel);
                 for (Edge edge : instanceVertex.getEdges(Direction.OUT, relationshipLabel)) {
                     final Vertex structInstanceVertex = edge.getVertex(Direction.IN);
...
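The repository hunks above stop concatenating edge labels by hand at every call site and route all callers through `getEdgeLabel` overloads that delegate to one string-based method. A minimal sketch of the delegation pattern (the prefix value and the `Class`-based overload are illustrative, not the repository's actual signatures):

```java
public class EdgeLabelSketch {
    // Illustrative prefix; the repository defines its own EDGE_LABEL_PREFIX constant.
    static final String EDGE_LABEL_PREFIX = "__";

    // The label format lives in exactly one place...
    public static String getEdgeLabel(String typeName, String attrName) {
        return EDGE_LABEL_PREFIX + typeName + "." + attrName;
    }

    // ...and thin overloads delegate to it, so no caller ever rebuilds
    // "prefix + type + '.' + attr" inline and risks drifting from the format.
    public static String getEdgeLabel(Class<?> type, String attrName) {
        return getEdgeLabel(type.getSimpleName(), attrName);
    }

    public static void main(String[] args) {
        System.out.println(getEdgeLabel("hive_table", "columns"));
    }
}
```

With the format centralized, a later change to the label scheme (or a bug fix, as in the `getQualifiedName` → `getEdgeLabel` replacements above) touches one method instead of a dozen call sites.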
repository/src/main/java/org/apache/hadoop/metadata/repository/graph/GraphBackedSearchIndexer.java
View file @ 5e815267
...
@@ -219,7 +219,8 @@ public class GraphBackedSearchIndexer implements SearchIndexer {
             case CLASS:
                 // this is only A reference, index the attribute for edge
-                createEdgeMixedIndex(propertyName);
+                // Commenting this out since we do not need an index for edge here
+                //createEdgeMixedIndex(propertyName);
                 break;

             default:
...
@@ -314,15 +315,23 @@ public class GraphBackedSearchIndexer implements SearchIndexer {
                     .dataType(propertyClass)
                     .make();

-            TitanGraphIndex vertexIndex = management.getGraphIndex(Constants.VERTEX_INDEX);
-            management.addIndexKey(vertexIndex, propertyKey);
-            management.commit();
+            if (propertyClass == Boolean.class) {
+                //Use standard index as backing index only supports string, int and geo types
+                management.buildIndex(propertyName, Vertex.class).addKey(propertyKey).buildCompositeIndex();
+                management.commit();
+            } else {
+                //Use backing index
+                TitanGraphIndex vertexIndex = management.getGraphIndex(Constants.VERTEX_INDEX);
+                management.addIndexKey(vertexIndex, propertyKey);
+                management.commit();
+            }
             LOG.info("Created mixed vertex index for property {}", propertyName);
         }

         return propertyKey;
     }

+    /* Commenting this out since we do not need an index for edge label here
     private void createEdgeMixedIndex(String propertyName) {
         TitanManagement management = titanGraph.getManagementSystem();
         EdgeLabel edgeLabel = management.getEdgeLabel(propertyName);
...
@@ -332,5 +341,5 @@ public class GraphBackedSearchIndexer implements SearchIndexer {
         management.commit();
         LOG.info("Created index for edge label {}", propertyName);
     }
-
+    */
 }
repository/src/main/java/org/apache/hadoop/metadata/repository/graph/TitanGraphProvider.java
View file @ 5e815267
...
@@ -25,6 +25,8 @@ import org.apache.commons.configuration.ConfigurationException;
 import org.apache.commons.configuration.PropertiesConfiguration;
 import org.apache.hadoop.metadata.MetadataException;
 import org.apache.hadoop.metadata.PropertiesUtil;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;

 import javax.inject.Singleton;
 import java.util.Iterator;
...
@@ -34,6 +36,8 @@ import java.util.Iterator;
  */
 public class TitanGraphProvider implements GraphProvider<TitanGraph> {

+    private static final Logger LOG = LoggerFactory.getLogger(TitanGraphProvider.class);
+
     /**
      * Constant for the configuration property that indicates the prefix.
      */
...
@@ -51,6 +55,7 @@ public class TitanGraphProvider implements GraphProvider<TitanGraph> {
             String value = (String) configProperties.getProperty(key);
             key = key.substring(METADATA_PREFIX.length());
             graphConfig.setProperty(key, value);
+            LOG.info("Using graph property {}={}", key, value);
         }
     }
...
repository/src/main/java/org/apache/hadoop/metadata/repository/typestore/GraphBackedTypeStore.java
View file @ 5e815267
...
@@ -207,46 +207,53 @@ public class GraphBackedTypeStore implements ITypeStore {
     @Override
     public TypesDef restore() throws MetadataException {
+        try {
+            titanGraph.rollback();  //Cleanup previous state
             //Get all vertices for type system
             Iterator vertices =
                     titanGraph.query().has(Constants.VERTEX_TYPE_PROPERTY_KEY, VERTEX_TYPE).vertices().iterator();

             ImmutableList.Builder<EnumTypeDefinition> enums = ImmutableList.builder();
             ImmutableList.Builder<StructTypeDefinition> structs = ImmutableList.builder();
             ImmutableList.Builder<HierarchicalTypeDefinition<ClassType>> classTypes = ImmutableList.builder();
             ImmutableList.Builder<HierarchicalTypeDefinition<TraitType>> traits = ImmutableList.builder();

             while (vertices.hasNext()) {
                 Vertex vertex = (Vertex) vertices.next();
                 DataTypes.TypeCategory typeCategory = vertex.getProperty(Constants.TYPE_CATEGORY_PROPERTY_KEY);
                 String typeName = vertex.getProperty(Constants.TYPENAME_PROPERTY_KEY);
                 LOG.info("Restoring type {}.{}", typeCategory, typeName);
                 switch (typeCategory) {
                 case ENUM:
                     enums.add(getEnumType(vertex));
                     break;

                 case STRUCT:
                     AttributeDefinition[] attributes = getAttributes(vertex);
                     structs.add(new StructTypeDefinition(typeName, attributes));
                     break;

                 case CLASS:
                     ImmutableList<String> superTypes = getSuperTypes(vertex);
                     attributes = getAttributes(vertex);
                     classTypes.add(new HierarchicalTypeDefinition(ClassType.class, typeName, superTypes, attributes));
                     break;

                 case TRAIT:
                     superTypes = getSuperTypes(vertex);
                     attributes = getAttributes(vertex);
                     traits.add(new HierarchicalTypeDefinition(TraitType.class, typeName, superTypes, attributes));
                     break;

                 default:
                     throw new IllegalArgumentException("Unhandled type category " + typeCategory);
                 }
             }
+            titanGraph.commit();
             return TypeUtils.getTypesDef(enums.build(), structs.build(), traits.build(), classTypes.build());
+        } finally {
+            titanGraph.rollback();
+        }
     }

     private EnumTypeDefinition getEnumType(Vertex vertex) {
...
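The `restore()` hunk above wraps the vertex scan in `try`/`finally` so the graph transaction is always cleaned up, whether the scan succeeds (commit, then the finally rollback is a no-op) or throws partway through. A minimal sketch of that guarantee, with a toy `Tx` class standing in for the Titan transaction (the real code calls `titanGraph.commit()`/`rollback()`):

```java
public class TxCleanupSketch {
    // Toy stand-in for a graph transaction.
    public static class Tx {
        public boolean open = true;
        public void commit()   { open = false; }
        public void rollback() { open = false; }  // no-op if already closed, as in real tx APIs
    }

    // The finally-block rollback runs on every exit path, so a failure while
    // reading vertices can never leave a dangling open transaction behind.
    public static String restore(Tx tx, boolean fail) {
        try {
            if (fail) {
                throw new IllegalStateException("read failed");  // simulated scan failure
            }
            tx.commit();       // success path commits...
            return "typesDef";
        } finally {
            tx.rollback();     // ...and cleanup runs regardless
        }
    }
}
```

Without the `finally`, an exception mid-scan would leave the transaction open and hold its locks until it times out.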
repository/src/main/java/org/apache/hadoop/metadata/services/DefaultMetadataService.java
View file @ 5e815267
...
@@ -20,6 +20,7 @@ package org.apache.hadoop.metadata.services;
 import com.google.common.base.Preconditions;
 import com.google.common.collect.ImmutableList;
+import com.google.inject.Injector;
 import org.apache.hadoop.metadata.MetadataException;
 import org.apache.hadoop.metadata.discovery.SearchIndexer;
 import org.apache.hadoop.metadata.listener.EntityChangeListener;
...
@@ -34,11 +35,7 @@ import org.apache.hadoop.metadata.typesystem.TypesDef;
 import org.apache.hadoop.metadata.typesystem.json.InstanceSerialization;
 import org.apache.hadoop.metadata.typesystem.json.Serialization$;
 import org.apache.hadoop.metadata.typesystem.json.TypesSerialization;
-import org.apache.hadoop.metadata.typesystem.types.ClassType;
-import org.apache.hadoop.metadata.typesystem.types.IDataType;
-import org.apache.hadoop.metadata.typesystem.types.Multiplicity;
-import org.apache.hadoop.metadata.typesystem.types.TraitType;
-import org.apache.hadoop.metadata.typesystem.types.TypeSystem;
+import org.apache.hadoop.metadata.typesystem.types.*;
 import org.codehaus.jettison.json.JSONException;
 import org.codehaus.jettison.json.JSONObject;
 import org.slf4j.Logger;
...
@@ -77,9 +74,22 @@ public class DefaultMetadataService implements MetadataService {
         this.typeSystem = TypeSystem.getInstance();
         this.repository = repository;

+        restoreTypeSystem();
         registerListener(searchIndexer);
     }

+    private void restoreTypeSystem() {
+        LOG.info("Restoring type system from the store");
+        try {
+            TypesDef typesDef = typeStore.restore();
+            typeSystem.defineTypes(typesDef);
+        } catch (MetadataException e) {
+            throw new RuntimeException(e);
+        }
+        LOG.info("Restored type system from the store");
+    }
+
     /**
      * Creates a new type based on the type system to enable adding
      * entities (instances for types).
...
@@ -144,8 +154,8 @@ public class DefaultMetadataService implements MetadataService {
      * @return list of trait type names in the type system
      */
     @Override
-    public List<String> getTraitNamesList() throws MetadataException {
-        return typeSystem.getTraitsNames();
+    public List<String> getTypeNamesByCategory(DataTypes.TypeCategory typeCategory) throws MetadataException {
+        return typeSystem.getTypeNamesByCategory(typeCategory);
     }

     /**
...
@@ -195,7 +205,7 @@ public class DefaultMetadataService implements MetadataService {
         Preconditions.checkNotNull(guid, "guid cannot be null");

         final ITypedReferenceableInstance instance = repository.getEntityDefinition(guid);
-        return Serialization$.MODULE$.toJson(instance);
+        return InstanceSerialization.toJson(instance, true);
     }

     /**
...
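The new `restoreTypeSystem()` above runs in the service constructor and converts a checked store failure into an unchecked exception, so a service with an unreadable type store simply fails to start. A minimal sketch of that construction-time restore pattern (the `Store` interface and names are illustrative stand-ins for `ITypeStore` and `TypesDef`, not the project's API):

```java
public class StartupRestoreSketch {
    // Stand-in for ITypeStore; restore() may fail with a checked exception.
    public interface Store {
        String restore() throws Exception;
    }

    private final String typesDef;

    // Restoring inside the constructor means the service is never observable
    // half-initialized: either construction succeeds with the types loaded,
    // or it fails fast with an unchecked exception.
    public StartupRestoreSketch(Store store) {
        try {
            this.typesDef = store.restore();
        } catch (Exception e) {
            throw new RuntimeException("Unable to restore type system", e);
        }
    }

    public String getTypesDef() {
        return typesDef;
    }
}
```

Failing fast at startup is preferable here to lazily restoring on first use, where a corrupt store would surface as a confusing mid-request error.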
repository/src/main/java/org/apache/hadoop/metadata/services/MetadataService.java
View file @ 5e815267
...
@@ -19,6 +19,7 @@
 package org.apache.hadoop.metadata.services;

 import org.apache.hadoop.metadata.MetadataException;
+import org.apache.hadoop.metadata.typesystem.types.DataTypes;
 import org.codehaus.jettison.json.JSONObject;

 import java.util.List;
...
@@ -57,7 +58,7 @@ public interface MetadataService {
      *
      * @return list of trait type names in the type system
      */
-    List<String> getTraitNamesList() throws MetadataException;
+    List<String> getTypeNamesByCategory(DataTypes.TypeCategory typeCategory) throws MetadataException;

     /**
      * Creates an entity, instance of the type.
...
repository/src/test/java/org/apache/hadoop/metadata/discovery/GraphBackedDiscoveryServiceTest.java
View file @ 5e815267
...
@@ -29,6 +29,7 @@ import org.apache.hadoop.metadata.discovery.graph.GraphBackedDiscoveryService;
 import org.apache.hadoop.metadata.query.HiveTitanSample;
 import org.apache.hadoop.metadata.query.QueryTestsUtils;
 import org.apache.hadoop.metadata.repository.graph.GraphBackedMetadataRepository;
+import org.apache.hadoop.metadata.repository.graph.GraphBackedSearchIndexer;
 import org.apache.hadoop.metadata.repository.graph.GraphHelper;
 import org.apache.hadoop.metadata.repository.graph.GraphProvider;
 import org.apache.hadoop.metadata.typesystem.ITypedReferenceableInstance;
...
@@ -224,6 +225,7 @@ public class GraphBackedDiscoveryServiceTest {
             {"Table as _loop0 loop (LoadProcess outputTable) withPath"},
             {"Table as src loop (LoadProcess outputTable) as dest select src.name as srcTable, dest.name as destTable withPath"},
             {"Table as t, sd, Column as c where t.name=\"sales_fact\" select c.name as colName, c.dataType as colType"},
+            {"Table where name='sales_fact', db where name='Reporting'"}
         };
     }
...
@@ -268,39 +270,6 @@ public class GraphBackedDiscoveryServiceTest {
     }

     @Test
-    public void testSearchByDSLQuery() throws Exception {
-        String dslQuery = "Column as PII";
-        System.out.println("Executing dslQuery = " + dslQuery);
-        String jsonResults = discoveryService.searchByDSL(dslQuery);
-        Assert.assertNotNull(jsonResults);
-
-        JSONObject results = new JSONObject(jsonResults);
-        Assert.assertEquals(results.length(), 3);
-        System.out.println("results = " + results);
-
-        Object query = results.get("query");
-        Assert.assertNotNull(query);
-
-        JSONObject dataType = results.getJSONObject("dataType");
-        Assert.assertNotNull(dataType);
-        String typeName = dataType.getString("typeName");
-        Assert.assertNotNull(typeName);
-
-        JSONArray rows = results.getJSONArray("rows");
-        Assert.assertNotNull(rows);
-        Assert.assertTrue(rows.length() > 0);
-
-        for (int index = 0; index < rows.length(); index++) {
-            JSONObject row = rows.getJSONObject(index);
-            String type = row.getString("$typeName$");
-            Assert.assertEquals(type, "Column");
-
-            String name = row.getString("name");
-            Assert.assertNotEquals(name, "null");
-        }
-    }
-
-    @Test
     public void testSearchForTypeInheritance() throws Exception {
         createTypesWithMultiLevelInheritance();
         createInstances();
...
repository/src/test/java/org/apache/hadoop/metadata/repository/graph/GraphBackedMetadataRepositoryTest.java
...
@@ -145,9 +145,8 @@ public class GraphBackedMetadataRepositoryTest {
     @Test(dependsOnMethods = "testSubmitEntity")
     public void testGetTraitLabel() throws Exception {
         Assert.assertEquals(repositoryService.getTraitLabel(
                 typeSystem.getDataType(ClassType.class, TABLE_TYPE), CLASSIFICATION),
                 TABLE_TYPE + "." + CLASSIFICATION);
     }

     @Test
...
@@ -317,6 +316,39 @@ public class GraphBackedMetadataRepositoryTest {
         Assert.assertEquals(repositoryService.getTypeName(tableVertex), TABLE_TYPE);
     }

+    @Test(dependsOnMethods = "testCreateEntity")
+    public void testSearchByDSLQuery() throws Exception {
+        String dslQuery = "hive_database as PII";
+        System.out.println("Executing dslQuery = " + dslQuery);
+        String jsonResults = discoveryService.searchByDSL(dslQuery);
+        Assert.assertNotNull(jsonResults);
+
+        JSONObject results = new JSONObject(jsonResults);
+        Assert.assertEquals(results.length(), 3);
+        System.out.println("results = " + results);
+
+        Object query = results.get("query");
+        Assert.assertNotNull(query);
+
+        JSONObject dataType = results.getJSONObject("dataType");
+        Assert.assertNotNull(dataType);
+        String typeName = dataType.getString("typeName");
+        Assert.assertNotNull(typeName);
+
+        JSONArray rows = results.getJSONArray("rows");
+        Assert.assertNotNull(rows);
+        Assert.assertTrue(rows.length() > 0);
+
+        for (int index = 0; index < rows.length(); index++) {
+            JSONObject row = rows.getJSONObject(index);
+            String type = row.getString("$typeName$");
+            Assert.assertEquals(type, "hive_database");
+
+            String name = row.getString("name");
+            Assert.assertEquals(name, DATABASE_NAME);
+        }
+    }
+
     /**
      * Full text search requires GraphBackedSearchIndexer, and GraphBackedSearchIndexer can't be enabled in
      * GraphBackedDiscoveryServiceTest because of its test data. So, test for full text search is in
...
src/conf/application.properties
...
@@ -30,12 +30,13 @@ metadata.graph.index.search.elasticsearch.create.sleep=2000
 ######### Hive Lineage Configs #########
 # This models follows the quick-start guide
-metadata.lineage.hive.table.type.name=Table
-metadata.lineage.hive.column.type.name=Column
+metadata.lineage.hive.table.type.name=hive_table
 metadata.lineage.hive.table.column.name=columns
-metadata.lineage.hive.process.type.name=LoadProcess
+metadata.lineage.hive.process.type.name=hive_process
 metadata.lineage.hive.process.inputs.name=inputTables
 metadata.lineage.hive.process.outputs.name=outputTables
+
+#Currently unused
+#metadata.lineage.hive.column.type.name=Column

 ######### Security Properties #########
...
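The renamed lineage keys above point the lineage service at the hive-model type names instead of the quick-start ones. As a minimal sketch of how such keys might be consumed (assuming plain java.util.Properties; the class and defaults here are hypothetical, only the key names come from the diff):

```java
import java.io.StringReader;
import java.util.Properties;

public class LineageConfigSketch {
    public static void main(String[] args) throws Exception {
        // Inline stand-in for application.properties, using the renamed keys.
        String conf =
                "metadata.lineage.hive.table.type.name=hive_table\n"
              + "metadata.lineage.hive.process.type.name=hive_process\n";

        Properties props = new Properties();
        props.load(new StringReader(conf));

        // Fall back to the old quick-start names when a key is absent.
        String tableType = props.getProperty("metadata.lineage.hive.table.type.name", "Table");
        String processType = props.getProperty("metadata.lineage.hive.process.type.name", "LoadProcess");
        String columnsAttr = props.getProperty("metadata.lineage.hive.table.column.name", "columns");

        System.out.println(tableType + " " + processType + " " + columnsAttr);
    }
}
```

Note how `metadata.lineage.hive.table.column.name` falls back to its default, since the key is not present in the snippet.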
typesystem/src/main/java/org/apache/hadoop/metadata/typesystem/types/TypeSystem.java
...
@@ -18,7 +18,9 @@
 package org.apache.hadoop.metadata.typesystem.types;

+import com.google.common.collect.ArrayListMultimap;
 import com.google.common.collect.ImmutableList;
+import com.google.common.collect.Multimap;
 import org.apache.hadoop.metadata.MetadataException;
 import org.apache.hadoop.metadata.classification.InterfaceAudience;
 import org.apache.hadoop.metadata.typesystem.TypesDef;
...
@@ -55,9 +57,9 @@ public class TypeSystem {
     private IdType idType;

     /**
-     * An in-memory copy of list of traits for convenience.
+     * An in-memory copy of type categories vs types for convenience.
      */
-    private List<String> traitTypes;
+    private Multimap<DataTypes.TypeCategory, String> typeCategoriesToTypeNamesMap;

     private ImmutableList<String> coreTypes;
...
@@ -79,7 +81,7 @@ public class TypeSystem {
     private void initialize() {
         types = new ConcurrentHashMap<>();
-        traitTypes = new ArrayList<>();
+        typeCategoriesToTypeNamesMap = ArrayListMultimap.create(DataTypes.TypeCategory.values().length, 10);

         registerPrimitiveTypes();
         registerCoreTypes();
...
@@ -94,12 +96,8 @@ public class TypeSystem {
         return ImmutableList.copyOf(types.keySet());
     }

-    public ImmutableList<String> getTraitsNames() {
-        return ImmutableList.copyOf(traitTypes);
-    }
-
-    private void addTraitName(String traitName) {
-        traitTypes.add(traitName);
+    public ImmutableList<String> getTypeNamesByCategory(DataTypes.TypeCategory typeCategory) {
+        return ImmutableList.copyOf(typeCategoriesToTypeNamesMap.get(typeCategory));
     }

     private void registerPrimitiveTypes() {
...
@@ -114,6 +112,8 @@ public class TypeSystem {
         types.put(DataTypes.BIGDECIMAL_TYPE.getName(), DataTypes.BIGDECIMAL_TYPE);
         types.put(DataTypes.DATE_TYPE.getName(), DataTypes.DATE_TYPE);
         types.put(DataTypes.STRING_TYPE.getName(), DataTypes.STRING_TYPE);
+
+        typeCategoriesToTypeNamesMap.putAll(DataTypes.TypeCategory.PRIMITIVE, types.keySet());
     }
...
@@ -267,6 +267,7 @@ public class TypeSystem {
         assert elemType != null;
         DataTypes.ArrayType dT = new DataTypes.ArrayType(elemType);
         types.put(dT.getName(), dT);
+        typeCategoriesToTypeNamesMap.put(DataTypes.TypeCategory.ARRAY, dT.getName());
         return dT;
     }
...
@@ -276,6 +277,7 @@ public class TypeSystem {
         assert valueType != null;
         DataTypes.MapType dT = new DataTypes.MapType(keyType, valueType);
         types.put(dT.getName(), dT);
+        typeCategoriesToTypeNamesMap.put(DataTypes.TypeCategory.MAP, dT.getName());
         return dT;
     }
...
@@ -291,6 +293,7 @@ public class TypeSystem {
         }
         EnumType eT = new EnumType(this, eDef.name, eDef.enumValues);
         types.put(eDef.name, eT);
+        typeCategoriesToTypeNamesMap.put(DataTypes.TypeCategory.ENUM, eDef.name);
         return eT;
     }
...
@@ -520,17 +523,19 @@ public class TypeSystem {
         for (StructTypeDefinition structDef : structDefs) {
             constructStructureType(structDef);
+            typeCategoriesToTypeNamesMap.put(DataTypes.TypeCategory.CLASS, structDef.typeName);
         }

         for (TraitType traitType : traitTypes) {
             constructHierarchicalType(TraitType.class,
                     traitNameToDefMap.get(traitType.getName()));
-            addTraitName(traitType.getName());
+            typeCategoriesToTypeNamesMap.put(DataTypes.TypeCategory.TRAIT, traitType.getName());
         }

         for (ClassType classType : classTypes) {
             constructHierarchicalType(ClassType.class,
                     classNameToDefMap.get(classType.getName()));
+            typeCategoriesToTypeNamesMap.put(DataTypes.TypeCategory.CLASS, classType.getName());
         }
     }
...
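The TypeSystem change above replaces a flat list of trait names with a category-to-names multimap, so any type category can be queried, not just traits. A minimal standalone sketch of the same idea, using only java.util (the real code uses Guava's ArrayListMultimap; all names here are illustrative stand-ins):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

public class TypeRegistrySketch {
    enum TypeCategory { PRIMITIVE, ENUM, ARRAY, MAP, STRUCT, TRAIT, CLASS }

    // Mirrors typeCategoriesToTypeNamesMap: one bucket of type names per category.
    private final Map<TypeCategory, List<String>> byCategory = new EnumMap<>(TypeCategory.class);

    void register(TypeCategory category, String typeName) {
        byCategory.computeIfAbsent(category, c -> new ArrayList<>()).add(typeName);
    }

    // Generalizes the old getTraitsNames(): any category can now be listed.
    List<String> getTypeNamesByCategory(TypeCategory category) {
        return Collections.unmodifiableList(
                byCategory.getOrDefault(category, Collections.emptyList()));
    }

    public static void main(String[] args) {
        TypeRegistrySketch ts = new TypeRegistrySketch();
        ts.register(TypeCategory.TRAIT, "Classification");
        ts.register(TypeCategory.CLASS, "hive_table");
        System.out.println(ts.getTypeNamesByCategory(TypeCategory.TRAIT));
    }
}
```

The design point is that registration and lookup are keyed by category once, instead of maintaining a separate list per kind of type.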
typesystem/src/test/java/org/apache/hadoop/metadata/typesystem/types/TypeSystemTest.java
...
@@ -86,7 +86,7 @@ public class TypeSystemTest extends BaseTest {
                 soxTrait, secTrait, financeTrait),
                 ImmutableList.<HierarchicalTypeDefinition<ClassType>>of());

-        final ImmutableList<String> traitsNames = getTypeSystem().getTraitsNames();
+        final ImmutableList<String> traitsNames = getTypeSystem().getTypeNamesByCategory(DataTypes.TypeCategory.TRAIT);
         Assert.assertEquals(traitsNames.size(), 7);
         List traits = Arrays.asList(new String[]{
                 "Classification",
...
webapp/src/main/java/org/apache/hadoop/metadata/web/listeners/GuiceServletConfig.java
...
@@ -20,7 +20,11 @@ package org.apache.hadoop.metadata.web.listeners;

 import com.google.inject.Guice;
 import com.google.inject.Injector;
+import com.google.inject.TypeLiteral;
+import com.google.inject.matcher.Matchers;
 import com.google.inject.servlet.GuiceServletContextListener;
+import com.google.inject.spi.TypeEncounter;
+import com.google.inject.spi.TypeListener;
 import com.sun.jersey.api.core.PackagesResourceConfig;
 import com.sun.jersey.guice.JerseyServletModule;
 import com.sun.jersey.guice.spi.container.servlet.GuiceContainer;
...
@@ -41,6 +45,8 @@ import javax.servlet.ServletContextEvent;
 import java.util.HashMap;
 import java.util.Map;

+import static com.google.inject.matcher.Matchers.*;
+
 public class GuiceServletConfig extends GuiceServletContextListener {

     private static final Logger LOG = LoggerFactory.getLogger(GuiceServletConfig.class);
...
@@ -105,22 +111,6 @@ public class GuiceServletConfig extends GuiceServletContextListener {
         // perform login operations
         LoginProcessor loginProcessor = new LoginProcessor();
         loginProcessor.login();
-
-        restoreTypeSystem();
-    }
-
-    private void restoreTypeSystem() {
-        LOG.info("Restoring type system from the store");
-        Injector injector = getInjector();
-        ITypeStore typeStore = injector.getInstance(ITypeStore.class);
-        try {
-            TypesDef typesDef = typeStore.restore();
-            TypeSystem typeSystem = injector.getInstance(TypeSystem.class);
-            typeSystem.defineTypes(typesDef);
-        } catch (MetadataException e) {
-            throw new RuntimeException(e);
-        }
-        LOG.info("Restored type system from the store");
     }

     @Override
...
webapp/src/main/java/org/apache/hadoop/metadata/web/resources/EntityResource.java
...
@@ -32,16 +32,7 @@ import org.slf4j.LoggerFactory;
 import javax.inject.Inject;
 import javax.inject.Singleton;
 import javax.servlet.http.HttpServletRequest;
-import javax.ws.rs.Consumes;
-import javax.ws.rs.DefaultValue;
-import javax.ws.rs.GET;
-import javax.ws.rs.POST;
-import javax.ws.rs.PUT;
-import javax.ws.rs.Path;
-import javax.ws.rs.PathParam;
-import javax.ws.rs.Produces;
-import javax.ws.rs.QueryParam;
-import javax.ws.rs.WebApplicationException;
+import javax.ws.rs.*;
 import javax.ws.rs.core.Context;
 import javax.ws.rs.core.MediaType;
 import javax.ws.rs.core.Response;
...
@@ -80,7 +71,6 @@ public class EntityResource {
      * Submits an entity definition (instance) corresponding to a given type.
      */
     @POST
-    @Path("submit")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response submit(@Context HttpServletRequest request) {
...
@@ -111,7 +101,7 @@ public class EntityResource {
      * @param guid GUID for the entity
      */
     @GET
-    @Path("definition/{guid}")
+    @Path("{guid}")
     @Produces(MediaType.APPLICATION_JSON)
     public Response getEntityDefinition(@PathParam("guid") String guid) {
         Preconditions.checkNotNull(guid, "Entity GUID cannot be null");
...
@@ -157,9 +147,8 @@ public class EntityResource {
      * @param resultsPerPage number of results for pagination
      */
     @GET
-    @Path("list/{entityType}")
     @Produces(MediaType.APPLICATION_JSON)
-    public Response getEntityList(@PathParam("entityType") String entityType,
+    public Response getEntityListByType(@QueryParam("type") String entityType,
                                   @DefaultValue("0") @QueryParam("offset") Integer offset,
                                   @QueryParam("numResults") Integer resultsPerPage) {
         Preconditions.checkNotNull(entityType, "Entity type cannot be null");
...
@@ -193,7 +182,7 @@ public class EntityResource {
      * @return response payload as json
      */
     @PUT
-    @Path("update/{guid}")
+    @Path("{guid}")
     @Produces(MediaType.APPLICATION_JSON)
     public Response update(@PathParam("guid") String guid,
                            @QueryParam("property") String property,
...
@@ -223,7 +212,7 @@ public class EntityResource {
      * @return a list of trait names for the given entity guid
      */
     @GET
-    @Path("traits/list/{guid}")
+    @Path("{guid}/traits")
     @Produces(MediaType.APPLICATION_JSON)
     public Response getTraitNames(@PathParam("guid") String guid) {
         Preconditions.checkNotNull(guid, "Entity GUID cannot be null");
...
@@ -256,7 +245,7 @@ public class EntityResource {
      * @param guid globally unique identifier for the entity
      */
     @POST
-    @Path("traits/add/{guid}")
+    @Path("{guid}/traits")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response addTrait(@Context HttpServletRequest request,
...
@@ -291,8 +280,8 @@ public class EntityResource {
      * @param guid globally unique identifier for the entity
      * @param traitName name of the trait
      */
-    @PUT
-    @Path("traits/delete/{guid}/{traitName}")
+    @DELETE
+    @Path("{guid}/traits/{traitName}")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response deleteTrait(@Context HttpServletRequest request,
...
@@ -312,11 +301,11 @@ public class EntityResource {
             return Response.ok(response).build();
         } catch (MetadataException | IllegalArgumentException e) {
-            LOG.error("Unable to add trait name={} for entity={}", traitName, guid, e);
+            LOG.error("Unable to delete trait name={} for entity={}", traitName, guid, e);
             throw new WebApplicationException(
                     Servlets.getErrorResponse(e, Response.Status.BAD_REQUEST));
         } catch (JSONException e) {
-            LOG.error("Unable to add trait name={} for entity={}", traitName, guid, e);
+            LOG.error("Unable to delete trait name={} for entity={}", traitName, guid, e);
             throw new WebApplicationException(
                     Servlets.getErrorResponse(e, Response.Status.INTERNAL_SERVER_ERROR));
         }
...
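The EntityResource changes above move from verb-style paths (`traits/add/{guid}`, `traits/delete/{guid}/{traitName}`) to resource-style paths where the verb is carried by the HTTP method (POST, DELETE). A small sketch of the resulting URL layout, assuming the `api/metadata/entities` base path from the test diffs below (the helper class itself is hypothetical):

```java
public class EntityUrlSketch {
    static final String BASE = "api/metadata/entities";

    // GET fetches the definition; PUT updates it. Same path, different method.
    static String entity(String guid) {
        return BASE + "/" + guid;
    }

    // GET lists trait names; POST adds a trait. Same path, different method.
    static String traits(String guid) {
        return BASE + "/" + guid + "/traits";
    }

    // DELETE removes one trait; the trait name is now a sub-resource.
    static String trait(String guid, String traitName) {
        return BASE + "/" + guid + "/traits/" + traitName;
    }

    public static void main(String[] args) {
        System.out.println(entity("42"));
        System.out.println(trait("42", "PII"));
    }
}
```

The net effect is one stable URL per resource, with the operation selected by HTTP method instead of being encoded in the path.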
webapp/src/main/java/org/apache/hadoop/metadata/web/resources/HiveLineageResource.java
...
@@ -31,12 +31,7 @@ import org.slf4j.LoggerFactory;
 import javax.inject.Inject;
 import javax.inject.Singleton;
 import javax.servlet.http.HttpServletRequest;
-import javax.ws.rs.Consumes;
-import javax.ws.rs.GET;
-import javax.ws.rs.Path;
-import javax.ws.rs.PathParam;
-import javax.ws.rs.Produces;
-import javax.ws.rs.WebApplicationException;
+import javax.ws.rs.*;
 import javax.ws.rs.core.Context;
 import javax.ws.rs.core.MediaType;
 import javax.ws.rs.core.Response;
...
@@ -69,7 +64,7 @@ public class HiveLineageResource {
      * @param tableName table name
      */
     @GET
-    @Path("inputs/{tableName}")
+    @Path("table/{tableName}/inputs")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response inputs(@Context HttpServletRequest request,
...
@@ -103,11 +98,12 @@ public class HiveLineageResource {
      * @param tableName table name
      */
     @GET
-    @Path("outputs/{tableName}")
+    @Path("table/{tableName}/outputs")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response outputs(@Context HttpServletRequest request,
                             @PathParam("tableName") String tableName) {
         Preconditions.checkNotNull(tableName, "table name cannot be null");
         LOG.info("Fetching lineage outputs for tableName={}", tableName);
...
@@ -137,11 +133,12 @@ public class HiveLineageResource {
      * @param tableName table name
      */
     @GET
-    @Path("schema/{tableName}")
+    @Path("table/{tableName}/schema")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response schema(@Context HttpServletRequest request,
                            @PathParam("tableName") String tableName) {
         Preconditions.checkNotNull(tableName, "table name cannot be null");
         LOG.info("Fetching schema for tableName={}", tableName);
...
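The lineage endpoints get the same treatment: `inputs/{tableName}` becomes `table/{tableName}/inputs`, scoping each view under the table resource. A sketch of the three resulting paths (the base path here is an assumption, not taken from this diff):

```java
public class LineageUrlSketch {
    // Assumed base path for illustration only.
    static final String BASE = "api/metadata/lineage/hive";

    // Each lineage view is a sub-resource of the table it describes.
    static String inputs(String table)  { return BASE + "/table/" + table + "/inputs"; }
    static String outputs(String table) { return BASE + "/table/" + table + "/outputs"; }
    static String schema(String table)  { return BASE + "/table/" + table + "/schema"; }

    public static void main(String[] args) {
        System.out.println(inputs("sales_fact"));
        System.out.println(schema("sales_fact"));
    }
}
```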
webapp/src/main/java/org/apache/hadoop/metadata/web/resources/TypesResource.java
...
@@ -18,9 +18,11 @@
 package org.apache.hadoop.metadata.web.resources;

+import com.google.common.base.Preconditions;
 import org.apache.hadoop.metadata.MetadataException;
 import org.apache.hadoop.metadata.MetadataServiceClient;
 import org.apache.hadoop.metadata.services.MetadataService;
+import org.apache.hadoop.metadata.typesystem.types.DataTypes;
 import org.apache.hadoop.metadata.web.util.Servlets;
 import org.codehaus.jettison.json.JSONArray;
 import org.codehaus.jettison.json.JSONException;
...
@@ -31,13 +33,7 @@ import org.slf4j.LoggerFactory;
 import javax.inject.Inject;
 import javax.inject.Singleton;
 import javax.servlet.http.HttpServletRequest;
-import javax.ws.rs.Consumes;
-import javax.ws.rs.GET;
-import javax.ws.rs.POST;
-import javax.ws.rs.Path;
-import javax.ws.rs.PathParam;
-import javax.ws.rs.Produces;
-import javax.ws.rs.WebApplicationException;
+import javax.ws.rs.*;
 import javax.ws.rs.core.Context;
 import javax.ws.rs.core.MediaType;
 import javax.ws.rs.core.Response;
...
@@ -59,6 +55,8 @@ public class TypesResource {
     private final MetadataService metadataService;

+    static final String TYPE_ALL = "all";
+
     @Inject
     public TypesResource(MetadataService metadataService) {
         this.metadataService = metadataService;
...
@@ -69,7 +67,6 @@ public class TypesResource {
      * domain. Could represent things like Hive Database, Hive Table, etc.
      */
     @POST
-    @Path("submit")
     @Consumes(MediaType.APPLICATION_JSON)
     @Produces(MediaType.APPLICATION_JSON)
     public Response submit(@Context HttpServletRequest request) {
...
@@ -97,7 +94,7 @@ public class TypesResource {
      * @param typeName name of a type which is unique.
      */
     @GET
-    @Path("definition/{typeName}")
+    @Path("{typeName}")
     @Produces(MediaType.APPLICATION_JSON)
     public Response getDefinition(@Context HttpServletRequest request,
                                   @PathParam("typeName") String typeName) {
...
@@ -122,44 +119,31 @@ public class TypesResource {
     }

     /**
-     * Gets the list of type names registered in the type system.
-     */
-    @GET
-    @Path("list")
-    @Produces(MediaType.APPLICATION_JSON)
-    public Response getTypeNames(@Context HttpServletRequest request) {
-        try {
-            final List<String> typeNamesList = metadataService.getTypeNamesList();
-
-            JSONObject response = new JSONObject();
-            response.put(MetadataServiceClient.RESULTS, new JSONArray(typeNamesList));
-            response.put(MetadataServiceClient.TOTAL_SIZE, typeNamesList.size());
-            response.put(MetadataServiceClient.REQUEST_ID, Servlets.getRequestId());
-
-            return Response.ok(response).build();
-        } catch (Exception e) {
-            LOG.error("Unable to get types list", e);
-            throw new WebApplicationException(
-                    Servlets.getErrorResponse(e, Response.Status.BAD_REQUEST));
-        }
-    }
-
-    /**
      * Gets the list of trait type names registered in the type system.
      */
     @GET
-    @Path("traits/list")
     @Produces(MediaType.APPLICATION_JSON)
-    public Response getTraitNames(@Context HttpServletRequest request) {
+    public Response getTypesByFilter(@Context HttpServletRequest request,
+                                     @DefaultValue(TYPE_ALL) @QueryParam("type") String type) {
         try {
-            final List<String> traitNamesList = metadataService.getTraitNamesList();
+            List<String> result = null;
+            if (TYPE_ALL.equals(type)) {
+                result = metadataService.getTypeNamesList();
+            } else {
+                DataTypes.TypeCategory typeCategory = DataTypes.TypeCategory.valueOf(type);
+                result = metadataService.getTypeNamesByCategory(typeCategory);
+            }

             JSONObject response = new JSONObject();
-            response.put(MetadataServiceClient.RESULTS, new JSONArray(traitNamesList));
-            response.put(MetadataServiceClient.TOTAL_SIZE, traitNamesList.size());
+            response.put(MetadataServiceClient.RESULTS, new JSONArray(result));
+            response.put(MetadataServiceClient.TOTAL_SIZE, result.size());
             response.put(MetadataServiceClient.REQUEST_ID, Servlets.getRequestId());

             return Response.ok(response).build();
+        } catch (IllegalArgumentException ie) {
+            LOG.error("Unsupported typeName while retrieving type list {}", type);
+            throw new WebApplicationException(
+                    Servlets.getErrorResponse("Unsupported type " + type, Response.Status.BAD_REQUEST));
         } catch (Exception e) {
             LOG.error("Unable to get types list", e);
             throw new WebApplicationException(
...
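The TypesResource change collapses the separate `list` and `traits/list` endpoints into one listing endpoint with a `type` query parameter: `all` (the default) returns every type name, any other value must parse as a `TypeCategory`, and a bad value maps to a 400 response via the `IllegalArgumentException` thrown by `valueOf`. A self-contained sketch of that filter logic (local stand-ins for the enum and registry):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

public class TypeFilterSketch {
    enum TypeCategory { TRAIT, CLASS }

    static final String TYPE_ALL = "all";

    // Mirrors getTypesByFilter: "all" returns everything; otherwise the
    // query value must parse as a TypeCategory, or valueOf throws
    // IllegalArgumentException (mapped to 400 BAD_REQUEST in the resource).
    static List<String> filter(String type, Map<TypeCategory, List<String>> byCategory) {
        if (TYPE_ALL.equals(type)) {
            List<String> all = new ArrayList<>();
            byCategory.values().forEach(all::addAll);
            return all;
        }
        TypeCategory category = TypeCategory.valueOf(type);
        return byCategory.getOrDefault(category, Collections.emptyList());
    }

    public static void main(String[] args) {
        Map<TypeCategory, List<String>> names = new EnumMap<>(TypeCategory.class);
        names.put(TypeCategory.TRAIT, Arrays.asList("PII"));
        names.put(TypeCategory.CLASS, Arrays.asList("hive_database", "hive_table"));
        System.out.println(filter("TRAIT", names));
        System.out.println(filter(TYPE_ALL, names).size());
    }
}
```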
webapp/src/test/java/org/apache/hadoop/metadata/web/resources/BaseResourceIT.java
View file @
5e815267
...
@@ -28,6 +28,8 @@ import org.apache.hadoop.metadata.typesystem.TypesDef;
...
@@ -28,6 +28,8 @@ import org.apache.hadoop.metadata.typesystem.TypesDef;
import
org.apache.hadoop.metadata.typesystem.json.InstanceSerialization
;
import
org.apache.hadoop.metadata.typesystem.json.InstanceSerialization
;
import
org.apache.hadoop.metadata.typesystem.json.TypesSerialization
;
import
org.apache.hadoop.metadata.typesystem.json.TypesSerialization
;
import
org.apache.hadoop.metadata.typesystem.persistence.Id
;
import
org.apache.hadoop.metadata.typesystem.persistence.Id
;
import
org.apache.hadoop.metadata.typesystem.types.ClassType
;
import
org.apache.hadoop.metadata.typesystem.types.HierarchicalTypeDefinition
;
import
org.codehaus.jettison.json.JSONObject
;
import
org.codehaus.jettison.json.JSONObject
;
import
org.testng.Assert
;
import
org.testng.Assert
;
import
org.testng.annotations.BeforeClass
;
import
org.testng.annotations.BeforeClass
;
...
@@ -59,13 +61,16 @@ public abstract class BaseResourceIT {
     }

     protected void createType(TypesDef typesDef) throws Exception {
-        String typesAsJSON = TypesSerialization.toJson(typesDef);
-        createType(typesAsJSON);
+        HierarchicalTypeDefinition<ClassType> sampleType = typesDef.classTypesAsJavaList().get(0);
+        if (serviceClient.getType(sampleType.typeName) == null) {
+            String typesAsJSON = TypesSerialization.toJson(typesDef);
+            createType(typesAsJSON);
+        }
     }

     protected void createType(String typesAsJSON) throws Exception {
         WebResource resource = service
-                .path("api/metadata/types/submit");
+                .path("api/metadata/types");

         ClientResponse clientResponse = resource
                 .accept(MediaType.APPLICATION_JSON)
...
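The hunk above makes test type creation idempotent: the client first asks the service for the type by name and only submits the definition when the lookup returns null. A minimal sketch of that check-then-create pattern, using a hypothetical in-memory `TypeRegistry` standing in for `serviceClient`:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the metadata service client: submit a type
// definition only if no type with that name is already registered.
public class TypeRegistry {
    private final Map<String, String> typesByName = new HashMap<>();

    // Returns true if the type was created, false if it already existed.
    public boolean createTypeIfAbsent(String typeName, String typeDefJson) {
        if (typesByName.containsKey(typeName)) {
            return false; // already registered; skip re-submission
        }
        typesByName.put(typeName, typeDefJson);
        return true;
    }

    public String getType(String typeName) {
        return typesByName.get(typeName);
    }

    public static void main(String[] args) {
        TypeRegistry registry = new TypeRegistry();
        System.out.println(registry.createTypeIfAbsent("hive_table_type", "{...}")); // true
        System.out.println(registry.createTypeIfAbsent("hive_table_type", "{...}")); // false
    }
}
```

This keeps repeated test setUp calls from failing when the type was registered by an earlier run.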
webapp/src/test/java/org/apache/hadoop/metadata/web/resources/EntityJerseyResourceIT.java
...
@@ -66,6 +66,8 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
     private static final String DATABASE_NAME = "foo";
     private static final String TABLE_TYPE = "hive_table_type";
     private static final String TABLE_NAME = "bar";
+    private static final String TRAITS = "traits";
+    private static final String TRAIT = "trait";

     private Referenceable tableInstance;
     private Id tableId;
...
@@ -148,11 +150,12 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         final String definition = response.getString(MetadataServiceClient.RESULTS);
         Assert.assertNotNull(definition);
         LOG.debug("tableInstanceAfterGet = " + definition);
+        InstanceSerialization.fromJsonReferenceable(definition, true);
     }

     private ClientResponse addProperty(String guid, String property, String value) {
         WebResource resource = service
-                .path("api/metadata/entities/update")
+                .path("api/metadata/entities")
                 .path(guid);

         return resource.queryParam("property", property).queryParam("value", value)
...
@@ -163,7 +166,7 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
     private ClientResponse getEntityDefinition(String guid) {
         WebResource resource = service
-                .path("api/metadata/entities/definition")
+                .path("api/metadata/entities")
                 .path(guid);
         return resource.accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
...
@@ -182,7 +185,7 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
     @Test
     public void testGetInvalidEntityDefinition() throws Exception {
         WebResource resource = service
-                .path("api/metadata/entities/definition")
+                .path("api/metadata/entities")
                 .path("blah");

         ClientResponse clientResponse = resource
...
@@ -198,8 +201,8 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
     @Test(dependsOnMethods = "testSubmitEntity")
     public void testGetEntityList() throws Exception {
         ClientResponse clientResponse = service
-                .path("api/metadata/entities/list/")
-                .path(TABLE_TYPE)
+                .path("api/metadata/entities")
+                .queryParam("type", TABLE_TYPE)
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.GET, ClientResponse.class);
...
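The hunk above moves the entity type out of the URL path (`entities/list/<type>`) and into a query parameter (`entities?type=<type>`), so one collection resource serves all type filters. A small sketch of the two URL shapes, built with plain string concatenation rather than the Jersey client (`EntityUrls` is a hypothetical helper, not part of the codebase):

```java
public class EntityUrls {
    // Old style: the type name is embedded as a path segment.
    static String oldListUrl(String typeName) {
        return "api/metadata/entities/list/" + typeName;
    }

    // New style: a single collection resource filtered by a query parameter.
    static String newListUrl(String typeName) {
        return "api/metadata/entities?type=" + typeName;
    }

    public static void main(String[] args) {
        System.out.println(oldListUrl("hive_table_type"));
        System.out.println(newListUrl("hive_table_type"));
    }
}
```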
@@ -219,7 +222,8 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
     @Test
     public void testGetEntityListForBadEntityType() throws Exception {
         ClientResponse clientResponse = service
-                .path("api/metadata/entities/list/blah")
+                .path("api/metadata/entities")
+                .queryParam("type", "blah")
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.GET, ClientResponse.class);
...
@@ -235,7 +239,8 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         addNewType();

         ClientResponse clientResponse = service
-                .path("api/metadata/entities/list/test")
+                .path("api/metadata/entities")
+                .queryParam("type", "test")
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.GET, ClientResponse.class);
...
@@ -266,8 +271,9 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
     public void testGetTraitNames() throws Exception {
         final String guid = tableId._getId();
         ClientResponse clientResponse = service
-                .path("api/metadata/entities/traits/list")
+                .path("api/metadata/entities")
                 .path(guid)
+                .path(TRAITS)
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.GET, ClientResponse.class);
...
@@ -299,8 +305,9 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         final String guid = tableId._getId();

         ClientResponse clientResponse = service
-                .path("api/metadata/entities/traits/add")
+                .path("api/metadata/entities")
                 .path(guid)
+                .path(TRAITS)
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.POST, ClientResponse.class, traitInstanceAsJSON);
...
@@ -328,8 +335,9 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         LOG.debug("traitInstanceAsJSON = " + traitInstanceAsJSON);

         ClientResponse clientResponse = service
-                .path("api/metadata/entities/traits/add")
+                .path("api/metadata/entities")
                 .path("random")
+                .path(TRAITS)
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.POST, ClientResponse.class, traitInstanceAsJSON);
...
@@ -343,12 +351,13 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         final String guid = tableId._getId();
         ClientResponse clientResponse = service
-                .path("api/metadata/entities/traits/delete")
+                .path("api/metadata/entities")
                 .path(guid)
+                .path(TRAITS)
                 .path(traitName)
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
-                .method(HttpMethod.PUT, ClientResponse.class);
+                .method(HttpMethod.DELETE, ClientResponse.class);

         Assert.assertEquals(clientResponse.getStatus(), Response.Status.OK.getStatusCode());
         String responseAsString = clientResponse.getEntity(String.class);
...
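After these changes the trait operations share one sub-resource, `api/metadata/entities/{guid}/traits`, and the HTTP verb carries the action: GET lists traits, POST adds one, and DELETE (previously PUT) removes `traits/{traitName}`. A sketch of the resulting paths (`TraitRoutes` is a hypothetical helper for illustration, not part of the codebase):

```java
public class TraitRoutes {
    // Builds the RESTful trait sub-resource path for an entity.
    static String traitsPath(String guid) {
        return "api/metadata/entities/" + guid + "/traits";
    }

    // Path for a single named trait on an entity (target of DELETE).
    static String traitPath(String guid, String traitName) {
        return traitsPath(guid) + "/" + traitName;
    }

    public static void main(String[] args) {
        System.out.println("GET    " + traitsPath("g1"));                     // list traits
        System.out.println("POST   " + traitsPath("g1"));                     // add a trait
        System.out.println("DELETE " + traitPath("g1", "classification"));    // remove a trait
    }
}
```

Using DELETE instead of PUT matches standard HTTP semantics: the request removes the addressed resource rather than replacing it.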
@@ -365,12 +374,13 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         final String traitName = "blah_trait";
         ClientResponse clientResponse = service
-                .path("api/metadata/entities/traits/delete")
+                .path("api/metadata/entities")
                 .path("random")
+                .path(TRAITS)
                 .path(traitName)
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
-                .method(HttpMethod.PUT, ClientResponse.class);
+                .method(HttpMethod.DELETE, ClientResponse.class);

         Assert.assertEquals(clientResponse.getStatus(),
                 Response.Status.BAD_REQUEST.getStatusCode());
     }
...
@@ -410,7 +420,9 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
                 new AttributeDefinition("serde2",
                         "serdeType", Multiplicity.REQUIRED, false, null),
                 new AttributeDefinition("database",
-                        DATABASE_TYPE, Multiplicity.REQUIRED, true, null));
+                        DATABASE_TYPE, Multiplicity.REQUIRED, true, null),
+                new AttributeDefinition("compressed",
+                        DataTypes.BOOLEAN_TYPE.getName(), Multiplicity.OPTIONAL, true, null));

         HierarchicalTypeDefinition<TraitType> classificationTraitDefinition =
                 TypesUtil.createTraitTypeDef("classification",
...
@@ -451,6 +463,7 @@ public class EntityJerseyResourceIT extends BaseResourceIT {
         tableInstance.set("level", 2);
         tableInstance.set("tableType", 1); // enum
         tableInstance.set("database", databaseInstance);
+        tableInstance.set("compressed", false);

         Struct traitInstance = (Struct) tableInstance.getTrait("classification");
         traitInstance.set("tag", "foundation_etl");
...
webapp/src/test/java/org/apache/hadoop/metadata/web/resources/HiveLineageJerseyResourceIT.java
...
@@ -53,6 +53,8 @@ import java.util.List;
  */
 public class HiveLineageJerseyResourceIT extends BaseResourceIT {

+    private static final String BASE_URI = "api/metadata/lineage/hive/table/";
+
     @BeforeClass
     public void setUp() throws Exception {
         super.setUp();
...
@@ -64,8 +66,9 @@ public class HiveLineageJerseyResourceIT extends BaseResourceIT {
     @Test
     public void testInputs() throws Exception {
         WebResource resource = service
-                .path("api/metadata/lineage/hive/inputs")
-                .path("sales_fact_monthly_mv");
+                .path(BASE_URI)
+                .path("sales_fact_monthly_mv")
+                .path("inputs");

         ClientResponse clientResponse = resource
                 .accept(MediaType.APPLICATION_JSON)
...
@@ -94,8 +97,9 @@ public class HiveLineageJerseyResourceIT extends BaseResourceIT {
     @Test
     public void testOutputs() throws Exception {
         WebResource resource = service
-                .path("api/metadata/lineage/hive/outputs")
-                .path("sales_fact");
+                .path(BASE_URI)
+                .path("sales_fact")
+                .path("outputs");

         ClientResponse clientResponse = resource
                 .accept(MediaType.APPLICATION_JSON)
...
@@ -124,8 +128,9 @@ public class HiveLineageJerseyResourceIT extends BaseResourceIT {
     @Test
     public void testSchema() throws Exception {
         WebResource resource = service
-                .path("api/metadata/lineage/hive/schema")
-                .path("sales_fact");
+                .path(BASE_URI)
+                .path("sales_fact")
+                .path("schema");

         ClientResponse clientResponse = resource
                 .accept(MediaType.APPLICATION_JSON)
...
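The lineage tests now derive every URL from `BASE_URI = "api/metadata/lineage/hive/table/"`, putting the table name first and the operation (`inputs`, `outputs`, `schema`) as the trailing segment. A sketch of that table-centric URL assembly (`LineageUrls` is a hypothetical helper, not part of the codebase):

```java
public class LineageUrls {
    static final String BASE_URI = "api/metadata/lineage/hive/table/";

    // e.g. api/metadata/lineage/hive/table/sales_fact/outputs
    static String lineageUrl(String tableName, String operation) {
        return BASE_URI + tableName + "/" + operation;
    }

    public static void main(String[] args) {
        System.out.println(lineageUrl("sales_fact_monthly_mv", "inputs"));
        System.out.println(lineageUrl("sales_fact", "schema"));
    }
}
```

Centralizing the prefix in one constant means a future change to the lineage mount point touches a single line of each test class.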
webapp/src/test/java/org/apache/hadoop/metadata/web/resources/TypesJerseyResourceIT.java
...
@@ -70,7 +70,7 @@ public class TypesJerseyResourceIT extends BaseResourceIT {
             System.out.println("typesAsJSON = " + typesAsJSON);

             WebResource resource = service
-                    .path("api/metadata/types/submit");
+                    .path("api/metadata/types");

             ClientResponse clientResponse = resource
                     .accept(MediaType.APPLICATION_JSON)
...
@@ -93,7 +93,7 @@ public class TypesJerseyResourceIT extends BaseResourceIT {
             System.out.println("typeName = " + typeDefinition.typeName);

             WebResource resource = service
-                    .path("api/metadata/types/definition")
+                    .path("api/metadata/types")
                     .path(typeDefinition.typeName);

             ClientResponse clientResponse = resource
...
@@ -114,7 +114,7 @@ public class TypesJerseyResourceIT extends BaseResourceIT {
     @Test
     public void testGetDefinitionForNonexistentType() throws Exception {
         WebResource resource = service
-                .path("api/metadata/types/definition")
+                .path("api/metadata/types")
                 .path("blah");

         ClientResponse clientResponse = resource
...
@@ -127,7 +127,7 @@ public class TypesJerseyResourceIT extends BaseResourceIT {
     @Test(dependsOnMethods = "testSubmit")
     public void testGetTypeNames() throws Exception {
         WebResource resource = service
-                .path("api/metadata/types/list");
+                .path("api/metadata/types");

         ClientResponse clientResponse = resource
                 .accept(MediaType.APPLICATION_JSON)
...
@@ -150,9 +150,10 @@ public class TypesJerseyResourceIT extends BaseResourceIT {
         String[] traitsAdded = addTraits();

         WebResource resource = service
-                .path("api/metadata/types/traits/list");
+                .path("api/metadata/types");

         ClientResponse clientResponse = resource
+                .queryParam("type", DataTypes.TypeCategory.TRAIT.name())
                 .accept(MediaType.APPLICATION_JSON)
                 .type(MediaType.APPLICATION_JSON)
                 .method(HttpMethod.GET, ClientResponse.class);
...