Commit e15629c2
Authored Jul 19, 2016 by Madhan Neethiraj; committed by Suma Shivaprasad, Jul 20, 2016
ATLAS-1033: fix for issues flagged by Coverity scan
Parent: b7f5995a
Showing 30 changed files, with 326 additions and 254 deletions.
...ain/java/org/apache/atlas/falcon/bridge/FalconBridge.java  +20 -11
...ge/src/main/java/org/apache/atlas/hive/hook/HiveHook.java  +47 -33
.../java/org/apache/atlas/authorize/simple/PolicyParser.java  +2 -11
.../apache/atlas/authorize/simple/SimpleAtlasAuthorizer.java  +7 -4
client/src/main/java/org/apache/atlas/AtlasClient.java  +14 -4
...src/main/java/org/apache/atlas/AtlasServiceException.java  +2 -2
...on/src/main/java/org/apache/atlas/utils/ParamChecker.java  +0 -14
...main/java/org/apache/atlas/hook/FailedMessagesLogger.java  +9 -7
release-log.txt  +1 -0
...ava/org/apache/atlas/discovery/DataSetLineageService.java  +6 -6
...he/atlas/discovery/graph/GraphBackedDiscoveryService.java  +5 -0
...java/org/apache/atlas/repository/graph/DeleteHandler.java  +22 -21
...atlas/repository/graph/GraphBackedMetadataRepository.java  +6 -4
...n/java/org/apache/atlas/repository/graph/GraphHelper.java  +16 -8
...he/atlas/repository/graph/TypedInstanceToGraphMapper.java  +16 -10
...ache/atlas/repository/typestore/GraphBackedTypeStore.java  +10 -6
...ava/org/apache/atlas/services/DefaultMetadataService.java  +30 -33
...ius/titan/diskstorage/hbase/HBaseKeyColumnValueStore.java  +4 -3
...main/java/org/apache/atlas/typesystem/persistence/Id.java  +3 -3
...a/org/apache/atlas/typesystem/types/AbstractDataType.java  +9 -3
...a/org/apache/atlas/typesystem/types/HierarchicalType.java  +14 -11
...g/apache/atlas/typesystem/types/ObjectGraphTraversal.java  +3 -1
.../org/apache/atlas/typesystem/types/ObjectGraphWalker.java  +3 -1
webapp/src/main/java/org/apache/atlas/Atlas.java  +5 -2
...p/src/main/java/org/apache/atlas/examples/QuickStart.java  +1 -1
...java/org/apache/atlas/util/CredentialProviderUtility.java  +34 -27
webapp/src/main/java/org/apache/atlas/web/dao/UserDao.java  +13 -1
...n/java/org/apache/atlas/web/resources/EntityResource.java  +5 -5
...apache/atlas/web/resources/MetadataDiscoveryResource.java  +6 -6
...pache/atlas/web/security/AtlasAuthenticationProvider.java  +13 -16
addons/falcon-bridge/src/main/java/org/apache/atlas/falcon/bridge/FalconBridge.java

@@ -28,6 +28,7 @@ import org.apache.atlas.hive.bridge.HiveMetaStoreBridge;
 import org.apache.atlas.hive.model.HiveDataModelGenerator;
 import org.apache.atlas.hive.model.HiveDataTypes;
 import org.apache.atlas.typesystem.Referenceable;
+import org.apache.commons.collections.CollectionUtils;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.falcon.entity.CatalogStorage;
 import org.apache.falcon.entity.FeedHelper;

@@ -284,18 +285,26 @@ public class FalconBridge {
                                                          Feed feed) throws Exception {
         org.apache.falcon.entity.v0.feed.Cluster feedCluster = FeedHelper.getCluster(feed, cluster.getName());
-        final CatalogTable table = getTable(feedCluster, feed);
-        if (table != null) {
-            CatalogStorage storage = new CatalogStorage(cluster, table);
-            return createHiveTableInstance(cluster.getName(), storage.getDatabase().toLowerCase(),
-                    storage.getTable().toLowerCase());
-        } else {
-            List<Location> locations = FeedHelper.getLocations(feedCluster, feed);
-            Location dataLocation = FileSystemStorage.getLocation(locations, LocationType.DATA);
-            final String pathUri = normalize(dataLocation.getPath());
-            LOG.info("Registering DFS Path {} ", pathUri);
-            return fillHDFSDataSet(pathUri, cluster.getName());
+        if (feedCluster != null) {
+            final CatalogTable table = getTable(feedCluster, feed);
+            if (table != null) {
+                CatalogStorage storage = new CatalogStorage(cluster, table);
+                return createHiveTableInstance(cluster.getName(), storage.getDatabase().toLowerCase(),
+                        storage.getTable().toLowerCase());
+            } else {
+                List<Location> locations = FeedHelper.getLocations(feedCluster, feed);
+                if (CollectionUtils.isNotEmpty(locations)) {
+                    Location dataLocation = FileSystemStorage.getLocation(locations, LocationType.DATA);
+                    if (dataLocation != null) {
+                        final String pathUri = normalize(dataLocation.getPath());
+                        LOG.info("Registering DFS Path {} ", pathUri);
+                        return fillHDFSDataSet(pathUri, cluster.getName());
+                    }
+                }
+            }
         }
+        return null;
     }

     private static CatalogTable getTable(org.apache.falcon.entity.v0.feed.Cluster cluster, Feed feed) {
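The FalconBridge change replaces unconditional dereferences (cluster lookup, location list, data location) with nested null and emptiness guards, falling through to a null return. A minimal standalone sketch of the same guard pattern, using plain strings as stand-ins for the Falcon types (names here are illustrative, not the real Atlas/Falcon API):

```java
import java.util.List;

public class FeedPathLookup {

    // Returns the normalized first data location, or null when any link in the
    // chain (location list, entry) is missing: the guard-then-dereference
    // pattern the commit introduces, instead of blind dereferencing.
    static String dataPath(List<String> locations) {
        if (locations == null || locations.isEmpty()) {  // CollectionUtils.isNotEmpty equivalent
            return null;
        }
        String dataLocation = locations.get(0);          // FileSystemStorage.getLocation stand-in
        if (dataLocation == null) {
            return null;
        }
        return dataLocation.toLowerCase();               // normalize() stand-in
    }
}
```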
addons/hive-bridge/src/main/java/org/apache/atlas/hive/hook/HiveHook.java

@@ -56,6 +56,7 @@ import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

 import java.net.MalformedURLException;
+import java.net.URI;
 import java.util.ArrayList;
 import java.util.Comparator;
 import java.util.Date;

@@ -491,37 +492,42 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
                 table = partition.getTable();
                 db = dgiBridge.hiveClient.getDatabase(table.getDbName());
                 break;

             default:
                 LOG.info("{}: entity-type not handled by Atlas hook. Ignored", entity.getType());
         }

-        db = dgiBridge.hiveClient.getDatabase(db.getName());
-        Referenceable dbEntity = dgiBridge.createDBInstance(db);
+        if (db != null) {
+            db = dgiBridge.hiveClient.getDatabase(db.getName());
+            Referenceable dbEntity = dgiBridge.createDBInstance(db);

-        entities.add(dbEntity);
-        result.put(Type.DATABASE, dbEntity);
+            entities.add(dbEntity);
+            result.put(Type.DATABASE, dbEntity);

-        Referenceable tableEntity = null;
+            Referenceable tableEntity = null;

-        if (table != null) {
-            if (existTable != null) {
-                table = existTable;
-            } else {
-                table = dgiBridge.hiveClient.getTable(table.getDbName(), table.getTableName());
-            }
-            //If its an external table, even though the temp table skip flag is on,
-            // we create the table since we need the HDFS path to temp table lineage.
-            if (skipTempTables && table.isTemporary() && !TableType.EXTERNAL_TABLE.equals(table.getTableType())) {
-                LOG.debug("Skipping temporary table registration {} since it is not an external table {} ", table.getTableName(), table.getTableType().name());
-            } else {
-                tableEntity = dgiBridge.createTableInstance(dbEntity, table);
-                entities.add(tableEntity);
-                result.put(Type.TABLE, tableEntity);
-            }
-        }
+            if (table != null) {
+                if (existTable != null) {
+                    table = existTable;
+                } else {
+                    table = dgiBridge.hiveClient.getTable(table.getDbName(), table.getTableName());
+                }
+                //If its an external table, even though the temp table skip flag is on,
+                // we create the table since we need the HDFS path to temp table lineage.
+                if (skipTempTables && table.isTemporary() && !TableType.EXTERNAL_TABLE.equals(table.getTableType())) {
+                    LOG.debug("Skipping temporary table registration {} since it is not an external table {} ", table.getTableName(), table.getTableType().name());
+                } else {
+                    tableEntity = dgiBridge.createTableInstance(dbEntity, table);
+                    entities.add(tableEntity);
+                    result.put(Type.TABLE, tableEntity);
+                }
+            }

-        event.addMessage(new HookNotification.EntityUpdateRequest(event.getUser(), entities));
+            event.addMessage(new HookNotification.EntityUpdateRequest(event.getUser(), entities));
+        }
         return result;
     }

@@ -620,13 +626,16 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
                     entities.addAll(result.values());
                 }
             } else if (entity.getType() == Type.DFS_DIR) {
-                final String pathUri = lower(new Path(entity.getLocation()).toString());
-                LOG.debug("Registering DFS Path {} ", pathUri);
-                if (!dataSetsProcessed.contains(pathUri)) {
-                    Referenceable hdfsPath = dgiBridge.fillHDFSDataSet(pathUri);
-                    dataSets.put(entity, hdfsPath);
-                    dataSetsProcessed.add(pathUri);
-                    entities.add(hdfsPath);
+                URI location = entity.getLocation();
+                if (location != null) {
+                    final String pathUri = lower(new Path(location).toString());
+                    LOG.debug("Registering DFS Path {} ", pathUri);
+                    if (!dataSetsProcessed.contains(pathUri)) {
+                        Referenceable hdfsPath = dgiBridge.fillHDFSDataSet(pathUri);
+                        dataSets.put(entity, hdfsPath);
+                        dataSetsProcessed.add(pathUri);
+                        entities.add(hdfsPath);
+                    }
                 }
             }

@@ -666,13 +675,17 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
     private void handleExternalTables(final HiveMetaStoreBridge dgiBridge, final HiveEventContext event,
                                       final LinkedHashMap<Type, Referenceable> tables) throws HiveException, MalformedURLException {
         List<Referenceable> entities = new ArrayList<>();
         final WriteEntity hiveEntity = (WriteEntity) getEntityByType(event.getOutputs(), Type.TABLE);
-        Table hiveTable = hiveEntity.getTable();
+        Table hiveTable = hiveEntity == null ? null : hiveEntity.getTable();
         //Refresh to get the correct location
-        hiveTable = dgiBridge.hiveClient.getTable(hiveTable.getDbName(), hiveTable.getTableName());
+        if (hiveTable != null) {
+            hiveTable = dgiBridge.hiveClient.getTable(hiveTable.getDbName(), hiveTable.getTableName());
+        }

-        final String location = lower(hiveTable.getDataLocation().toString());
         if (hiveTable != null && TableType.EXTERNAL_TABLE.equals(hiveTable.getTableType())) {
             LOG.info("Registering external table process {} ", event.getQueryStr());
+            final String location = lower(hiveTable.getDataLocation().toString());
             final ReadEntity dfsEntity = new ReadEntity();
             dfsEntity.setTyp(Type.DFS_DIR);
             dfsEntity.setName(location);

@@ -702,6 +715,7 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
             entities.add(processReferenceable);
             event.addMessage(new HookNotification.EntityUpdateRequest(event.getUser(), entities));
+        }
     }

     private boolean isCreateOp(HiveEventContext hiveEvent) {
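The DFS_DIR hunk reads the entity's location into a local, tests it for null, and only then builds the path string, deduplicating via the processed-paths set. A small sketch of the same pattern, with illustrative names rather than the real HiveHook members:

```java
import java.net.URI;
import java.util.Set;

public class DfsDirGuard {

    // Returns the registered path string, or null when the entity has no
    // location (the case the old code NPE'd on) or the path was already seen.
    static String registerPath(URI location, Set<String> processed) {
        if (location == null) {
            return null;                                 // skip instead of dereferencing
        }
        String pathUri = location.toString().toLowerCase();  // lower(new Path(location).toString()) stand-in
        if (processed.add(pathUri)) {                    // true only on first sight, like dataSetsProcessed
            return pathUri;
        }
        return null;
    }
}
```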
authorization/src/main/java/org/apache/atlas/authorize/simple/PolicyParser.java

@@ -161,12 +161,7 @@ public class PolicyParser {
             if (def.getUsers() != null) {
                 usersMap = def.getUsers();
             }
-            List<AtlasActionTypes> userAutorities = usersMap.get(userAndRole[USERNAME]);
-
-            if (userAutorities == null) {
-                userAutorities = new ArrayList<AtlasActionTypes>();
-            }
-            userAutorities = getListOfAutorities(userAndRole[USER_AUTHORITIES]);
+            List<AtlasActionTypes> userAutorities = getListOfAutorities(userAndRole[USER_AUTHORITIES]);
             usersMap.put(userAndRole[USERNAME], userAutorities);
             def.setUsers(usersMap);
         }

@@ -195,11 +190,7 @@ public class PolicyParser {
             if (def.getGroups() != null) {
                 groupsMap = def.getGroups();
             }
-            List<AtlasActionTypes> groupAutorities = groupsMap.get(groupAndRole[GROUPNAME]);
-            if (groupAutorities == null) {
-                groupAutorities = new ArrayList<AtlasActionTypes>();
-            }
-            groupAutorities = getListOfAutorities(groupAndRole[GROUP_AUTHORITIES]);
+            List<AtlasActionTypes> groupAutorities = getListOfAutorities(groupAndRole[GROUP_AUTHORITIES]);
             groupsMap.put(groupAndRole[GROUPNAME], groupAutorities);
             def.setGroups(groupsMap);
         }
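Both PolicyParser hunks remove a dead store: the value looked up from the map (and the fallback list) was immediately overwritten before use, a defect class Coverity reports as UNUSED_VALUE. A sketch of the surviving form, with illustrative names: compute the value once, then put it:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class AuthorityMapSketch {

    // Single assignment: no map lookup whose result is discarded, which is
    // exactly what the dead-store fix reduces the original code to.
    static Map<String, List<String>> record(Map<String, List<String>> users,
                                            String name, String rawAuthorities) {
        List<String> authorities = Arrays.asList(rawAuthorities.split(","));  // getListOfAutorities stand-in
        users.put(name, authorities);
        return users;
    }
}
```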
authorization/src/main/java/org/apache/atlas/authorize/simple/SimpleAtlasAuthorizer.java

@@ -32,6 +32,7 @@ import org.apache.atlas.authorize.AtlasAuthorizationException;
 import org.apache.atlas.authorize.AtlasAuthorizer;
 import org.apache.atlas.authorize.AtlasResourceTypes;
 import org.apache.atlas.utils.PropertiesUtil;
+import org.apache.commons.collections.CollectionUtils;
 import org.apache.commons.configuration.Configuration;
 import org.apache.commons.io.FilenameUtils;
 import org.apache.commons.io.IOCase;

@@ -224,10 +225,12 @@ public final class SimpleAtlasAuthorizer implements AtlasAuthorizer {
             LOG.debug("==> SimpleAtlasAuthorizer checkAccessForGroups");
         }

-        for (String group : groups) {
-            isAccessAllowed = checkAccess(group, resourceType, resource, map);
-            if (isAccessAllowed) {
-                break;
+        if (CollectionUtils.isNotEmpty(groups)) {
+            for (String group : groups) {
+                isAccessAllowed = checkAccess(group, resourceType, resource, map);
+                if (isAccessAllowed) {
+                    break;
+                }
             }
         }
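Iterating a null collection with a for-each throws an NPE, so the group check now runs only when the list is non-empty (`CollectionUtils.isNotEmpty` in the real code). A minimal sketch using only the standard library, with hypothetical names:

```java
import java.util.List;

public class GroupAccessSketch {

    // Mirrors checkAccessForGroups: false for a null or empty group list,
    // otherwise true as soon as any group matches.
    static boolean anyAllowed(List<String> groups, String allowedGroup) {
        boolean isAccessAllowed = false;
        if (groups != null && !groups.isEmpty()) {   // CollectionUtils.isNotEmpty equivalent
            for (String group : groups) {
                isAccessAllowed = allowedGroup.equals(group);  // checkAccess stand-in
                if (isAccessAllowed) {
                    break;
                }
            }
        }
        return isAccessAllowed;
    }
}
```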
client/src/main/java/org/apache/atlas/AtlasClient.java

@@ -144,8 +144,15 @@ public class AtlasClient {
     // New constuctor for Basic auth
     public AtlasClient(String[] baseUrl, String[] basicAuthUserNamepassword) {
-        this.basicAuthUser = basicAuthUserNamepassword[0];
-        this.basicAuthPassword = basicAuthUserNamepassword[1];
+        if (basicAuthUserNamepassword != null) {
+            if (basicAuthUserNamepassword.length > 0) {
+                this.basicAuthUser = basicAuthUserNamepassword[0];
+            }
+            if (basicAuthUserNamepassword.length > 1) {
+                this.basicAuthPassword = basicAuthUserNamepassword[1];
+            }
+        }

         initializeState(baseUrl, null, null);
     }

@@ -1119,7 +1126,8 @@ public class AtlasClient {
     private JSONObject callAPIWithResource(API api, WebResource resource, Object requestObject)
             throws AtlasServiceException {
         ClientResponse clientResponse = null;
-        for (int i = 0; i < getNumberOfRetries(); i++) {
+        int i = 0;
+        do {
             clientResponse = resource.accept(JSON_MEDIA_TYPE).type(JSON_MEDIA_TYPE)
                     .method(api.getMethod(), ClientResponse.class, requestObject);

@@ -1137,7 +1145,9 @@ public class AtlasClient {
                 LOG.error("Got a service unavailable when calling: {}, will retry..", resource);
                 sleepBetweenRetries();
             }
-        }
+
+            i++;
+        } while (i < getNumberOfRetries());

         throw new AtlasServiceException(api, clientResponse);
     }
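The retry-loop change is worth calling out: a for loop makes zero requests when `getNumberOfRetries()` returns 0, leaving `clientResponse` null for the exception constructor, while a do-while guarantees at least one attempt. A counter stands in for the real HTTP call in this sketch:

```java
public class RetryLoopSketch {

    // Counts how many attempts the do-while shape makes; the body stands in
    // for the resource.method(...) call in AtlasClient.callAPIWithResource.
    static int attemptsMade(int numberOfRetries) {
        int attempts = 0;
        int i = 0;
        do {
            attempts++;   // always runs at least once, even for a retry count of 0
            i++;
        } while (i < numberOfRetries);
        return attempts;
    }
}
```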
client/src/main/java/org/apache/atlas/AtlasServiceException.java

@@ -37,8 +37,8 @@ public class AtlasServiceException extends Exception {
     }

     private AtlasServiceException(AtlasClient.API api, ClientResponse.Status status, String response) {
-        super("Metadata service API " + api + " failed with status " + status.getStatusCode() + "("
-                + status.getReasonPhrase() + ") Response Body (" + response + ")");
+        super("Metadata service API " + api + " failed with status " + (status != null ? status.getStatusCode() : -1)
+                + " (" + status + ") Response Body (" + response + ")");
         this.status = status;
     }
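The exception message now tolerates a null status: the ternary reports -1, and string concatenation renders the null object as "null" instead of dereferencing it. A standalone sketch of the same message-building pattern (method and parameter names are illustrative):

```java
public class StatusMessageSketch {

    // Builds the failure message without dereferencing a possibly-null status.
    static String message(Integer statusCode, String reason, String body) {
        return "failed with status " + (statusCode != null ? statusCode : -1)
                + " (" + reason + ") Response Body (" + body + ")";
    }
}
```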
common/src/main/java/org/apache/atlas/utils/ParamChecker.java

@@ -139,20 +139,6 @@ public final class ParamChecker {
     }

-    /**
-     * Check that a list is not null and that none of its elements is null. If null or if the list has emtpy elements
-     * throws an IllegalArgumentException.
-     * @param list the list of strings.
-     * @param name parameter name for the exception message.
-     */
-    public static Collection<String> notEmptyElements(Collection<String> list, String name) {
-        notEmpty(list, name);
-        for (String ele : list) {
-            notEmpty(ele, String.format("list %s element %s", name, ele));
-        }
-        return list;
-    }
-
     /**
      * Checks that the given value is <= max value.
      * @param value
      * @param maxValue
notification/src/main/java/org/apache/atlas/hook/FailedMessagesLogger.java

@@ -77,13 +77,15 @@ public class FailedMessagesLogger {
         org.apache.log4j.Logger rootLogger = org.apache.log4j.Logger.getRootLogger();
         Enumeration allAppenders = rootLogger.getAllAppenders();

-        while (allAppenders.hasMoreElements()) {
-            Appender appender = (Appender) allAppenders.nextElement();
-            if (appender instanceof FileAppender) {
-                FileAppender fileAppender = (FileAppender) appender;
-                String rootLoggerFile = fileAppender.getFile();
-                rootLoggerDirectory = new File(rootLoggerFile).getParent();
-                break;
+        if (allAppenders != null) {
+            while (allAppenders.hasMoreElements()) {
+                Appender appender = (Appender) allAppenders.nextElement();
+                if (appender instanceof FileAppender) {
+                    FileAppender fileAppender = (FileAppender) appender;
+                    String rootLoggerFile = fileAppender.getFile();
+                    rootLoggerDirectory = new File(rootLoggerFile).getParent();
+                    break;
+                }
             }
         }
         return rootLoggerDirectory;
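The appender scan is now wrapped in a null check on the enumeration returned by `getAllAppenders()`. A sketch of the same guarded-enumeration shape, with a plain `Enumeration<String>` standing in for the log4j appender enumeration (the `instanceof FileAppender` filter becomes a simple null filter here):

```java
import java.util.Enumeration;

public class AppenderScanSketch {

    // Scans the enumeration for the first usable entry, tolerating a null
    // enumeration the way the fixed FailedMessagesLogger does.
    static String firstFile(Enumeration<String> appenderFiles) {
        String dir = null;
        if (appenderFiles != null) {
            while (appenderFiles.hasMoreElements()) {
                String file = appenderFiles.nextElement();
                if (file != null) {   // instanceof FileAppender stand-in
                    dir = file;
                    break;
                }
            }
        }
        return dir;
    }
}
```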
release-log.txt

@@ -6,6 +6,7 @@ INCOMPATIBLE CHANGES:

 ALL CHANGES:
+ATLAS-1033 fix for issues flagged by Coverity scan (mneethiraj via sumasai)
 ATLAS-1036 Compilation error on java 1.8 - GraphBackedDiscoveryService (shwethags via sumasai)
 ATLAS-1034 Incorrect Falcon hook impl class name in Falcon hook shim (mneethiraj via shwethags)
 ATLAS-347 Atlas search APIs should allow pagination of results (shwethags)
repository/src/main/java/org/apache/atlas/discovery/DataSetLineageService.java

@@ -101,7 +101,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getOutputsGraph(String datasetName) throws AtlasException {
         LOG.info("Fetching lineage outputs graph for datasetName={}", datasetName);
-        ParamChecker.notEmpty(datasetName, "dataset name");
+        datasetName = ParamChecker.notEmpty(datasetName, "dataset name");
         ReferenceableInstance datasetInstance = validateDatasetNameExists(datasetName);
         return getOutputsGraphForId(datasetInstance.getId()._getId());
     }

@@ -116,7 +116,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getInputsGraph(String tableName) throws AtlasException {
         LOG.info("Fetching lineage inputs graph for tableName={}", tableName);
-        ParamChecker.notEmpty(tableName, "table name");
+        tableName = ParamChecker.notEmpty(tableName, "table name");
         ReferenceableInstance datasetInstance = validateDatasetNameExists(tableName);
         return getInputsGraphForId(datasetInstance.getId()._getId());
     }

@@ -125,7 +125,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getInputsGraphForEntity(String guid) throws AtlasException {
         LOG.info("Fetching lineage inputs graph for entity={}", guid);
-        ParamChecker.notEmpty(guid, "Entity id");
+        guid = ParamChecker.notEmpty(guid, "Entity id");
         validateDatasetExists(guid);
         return getInputsGraphForId(guid);
     }

@@ -143,7 +143,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getOutputsGraphForEntity(String guid) throws AtlasException {
         LOG.info("Fetching lineage outputs graph for entity guid={}", guid);
-        ParamChecker.notEmpty(guid, "Entity id");
+        guid = ParamChecker.notEmpty(guid, "Entity id");
         validateDatasetExists(guid);
         return getOutputsGraphForId(guid);
     }

@@ -165,7 +165,7 @@ public class DataSetLineageService implements LineageService {
     @Override
     @GraphTransaction
     public String getSchema(String datasetName) throws AtlasException {
-        ParamChecker.notEmpty(datasetName, "table name");
+        datasetName = ParamChecker.notEmpty(datasetName, "table name");
         LOG.info("Fetching schema for tableName={}", datasetName);
         ReferenceableInstance datasetInstance = validateDatasetNameExists(datasetName);

@@ -182,7 +182,7 @@ public class DataSetLineageService implements LineageService {
     @Override
     @GraphTransaction
     public String getSchemaForEntity(String guid) throws AtlasException {
-        ParamChecker.notEmpty(guid, "Entity id");
+        guid = ParamChecker.notEmpty(guid, "Entity id");
         LOG.info("Fetching schema for entity guid={}", guid);
         String typeName = validateDatasetExists(guid);
         return getSchemaForId(typeName, guid);
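Every hunk in this file is the same one-line change: `ParamChecker.notEmpty` both validates and returns its argument, and the callers now assign that result instead of discarding it. A self-contained sketch of the pattern (the `notEmpty` signature here is illustrative, not necessarily identical to the Atlas helper):

```java
public class ParamCheckSketch {

    // Validates and returns the value, so callers can write
    // x = notEmpty(x, name) rather than ignoring the return.
    static String notEmpty(String value, String name) {
        if (value == null || value.trim().isEmpty()) {
            throw new IllegalArgumentException(name + " cannot be empty");
        }
        return value;
    }

    // Mirrors the fixed call sites: assign the checked value back.
    static String lookup(String datasetName) {
        datasetName = notEmpty(datasetName, "dataset name");
        return "graph:" + datasetName;
    }
}
```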
repository/src/main/java/org/apache/atlas/discovery/graph/GraphBackedDiscoveryService.java

@@ -167,6 +167,11 @@ public class GraphBackedDiscoveryService implements DiscoveryService {
         LOG.info("Executing gremlin query={}", gremlinQuery);
         ScriptEngineManager manager = new ScriptEngineManager();
         ScriptEngine engine = manager.getEngineByName("gremlin-groovy");
+
+        if (engine == null) {
+            throw new DiscoveryException("gremlin-groovy: engine not found");
+        }
+
         Bindings bindings = engine.createBindings();
         bindings.put("g", titanGraph);
repository/src/main/java/org/apache/atlas/repository/graph/DeleteHandler.java

@@ -334,28 +334,29 @@ public abstract class DeleteHandler {
                     String keyPropertyName = GraphHelper.getQualifiedNameForMapKey(propertyName, key);
                     String mapEdgeId = GraphHelper.getProperty(outVertex, keyPropertyName);
                     Edge mapEdge = graphHelper.getEdgeByEdgeId(outVertex, keyPropertyName, mapEdgeId);
-                    Vertex mapVertex = mapEdge.getVertex(Direction.IN);
-                    if (mapVertex.getId().toString().equals(inVertex.getId().toString())) {
-                        //TODO keys.size includes deleted items as well. should exclude
-                        if (attributeInfo.multiplicity.nullAllowed() || keys.size() > attributeInfo.multiplicity.lower) {
-                            edge = mapEdge;
-                        } else {
-                            // Deleting this entry would violate the attribute's lower bound.
-                            throw new NullRequiredAttributeException("Cannot remove map entry " + keyPropertyName
-                                    + " from required attribute " + GraphHelper.getQualifiedFieldName(type, attributeName)
-                                    + " on " + string(outVertex) + " " + string(mapEdge));
-                        }
-
-                        if (shouldUpdateReverseAttribute) {
-                            //remove this key
-                            LOG.debug("Removing edge {}, key {} from the map attribute {}", string(mapEdge), key, attributeName);
-                            keys.remove(key);
-                            GraphHelper.setProperty(outVertex, propertyName, keys);
-                            GraphHelper.setProperty(outVertex, keyPropertyName, null);
+                    if (mapEdge != null) {
+                        Vertex mapVertex = mapEdge.getVertex(Direction.IN);
+                        if (mapVertex.getId().toString().equals(inVertex.getId().toString())) {
+                            //TODO keys.size includes deleted items as well. should exclude
+                            if (attributeInfo.multiplicity.nullAllowed() || keys.size() > attributeInfo.multiplicity.lower) {
+                                edge = mapEdge;
+                            } else {
+                                // Deleting this entry would violate the attribute's lower bound.
+                                throw new NullRequiredAttributeException("Cannot remove map entry " + keyPropertyName
+                                        + " from required attribute " + GraphHelper.getQualifiedFieldName(type, attributeName)
+                                        + " on " + string(outVertex) + " " + string(mapEdge));
+                            }
+
+                            if (shouldUpdateReverseAttribute) {
+                                //remove this key
+                                LOG.debug("Removing edge {}, key {} from the map attribute {}", string(mapEdge), key, attributeName);
+                                keys.remove(key);
+                                GraphHelper.setProperty(outVertex, propertyName, keys);
+                                GraphHelper.setProperty(outVertex, keyPropertyName, null);
+                            }
+                            break;
                         }
-                        break;
                     }
                 }
             }
repository/src/main/java/org/apache/atlas/repository/graph/GraphBackedMetadataRepository.java

@@ -268,11 +268,13 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
             final String entityTypeName = GraphHelper.getTypeName(instanceVertex);
             String relationshipLabel = GraphHelper.getTraitLabel(entityTypeName, traitNameToBeDeleted);
             Edge edge = GraphHelper.getEdgeForLabel(instanceVertex, relationshipLabel);
-            deleteHandler.deleteEdgeReference(edge, DataTypes.TypeCategory.TRAIT, false, true);
+            if (edge != null) {
+                deleteHandler.deleteEdgeReference(edge, DataTypes.TypeCategory.TRAIT, false, true);

-            // update the traits in entity once trait removal is successful
-            traitNames.remove(traitNameToBeDeleted);
-            updateTraits(instanceVertex, traitNames);
+                // update the traits in entity once trait removal is successful
+                traitNames.remove(traitNameToBeDeleted);
+                updateTraits(instanceVertex, traitNames);
+            }
         } catch (Exception e) {
             throw new RepositoryException(e);
         }
repository/src/main/java/org/apache/atlas/repository/graph/GraphHelper.java

@@ -429,20 +429,28 @@ public final class GraphHelper {
     }

     public static String string(Vertex vertex) {
-        if (LOG.isDebugEnabled()) {
-            return String.format("vertex[id=%s type=%s guid=%s]", vertex.getId().toString(), getTypeName(vertex),
-                    getIdFromVertex(vertex));
+        if (vertex == null) {
+            return "vertex[null]";
         } else {
-            return String.format("vertex[id=%s]", vertex.getId().toString());
+            if (LOG.isDebugEnabled()) {
+                return String.format("vertex[id=%s type=%s guid=%s]", vertex.getId().toString(), getTypeName(vertex),
+                        getIdFromVertex(vertex));
+            } else {
+                return String.format("vertex[id=%s]", vertex.getId().toString());
+            }
         }
     }

     public static String string(Edge edge) {
-        if (LOG.isDebugEnabled()) {
-            return String.format("edge[id=%s label=%s from %s -> to %s]", edge.getId().toString(), edge.getLabel(),
-                    string(edge.getVertex(Direction.OUT)), string(edge.getVertex(Direction.IN)));
+        if (edge == null) {
+            return "edge[null]";
         } else {
-            return String.format("edge[id=%s]", edge.getId().toString());
+            if (LOG.isDebugEnabled()) {
+                return String.format("edge[id=%s label=%s from %s -> to %s]", edge.getId().toString(), edge.getLabel(),
+                        string(edge.getVertex(Direction.OUT)), string(edge.getVertex(Direction.IN)));
+            } else {
+                return String.format("edge[id=%s]", edge.getId().toString());
+            }
         }
     }
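The `string(...)` helpers are used inside log and exception messages, so they must not NPE on a null element themselves; the fix returns a literal placeholder first. A compact sketch of the same null-tolerant formatter, with a plain `Object` id standing in for the Blueprints `Vertex`:

```java
public class ElementStringSketch {

    // Null-safe formatter: a null element yields a placeholder instead of an
    // NPE thrown while building some other error message.
    static String string(Object vertexId) {
        if (vertexId == null) {
            return "vertex[null]";
        }
        return String.format("vertex[id=%s]", vertexId);
    }
}
```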
repository/src/main/java/org/apache/atlas/repository/graph/TypedInstanceToGraphMapper.java

@@ -370,10 +370,12 @@ public final class TypedInstanceToGraphMapper {
         if (!cloneElements.isEmpty()) {
             for (String edgeIdForDelete : cloneElements) {
                 Edge edge = graphHelper.getEdgeByEdgeId(instanceVertex, edgeLabel, edgeIdForDelete);
-                boolean deleted = deleteHandler.deleteEdgeReference(edge, entryType.getTypeCategory(),
-                        attributeInfo.isComposite, true);
-                if (!deleted) {
-                    additionalElements.add(edgeIdForDelete);
+                if (edge != null) {
+                    boolean deleted = deleteHandler.deleteEdgeReference(edge, entryType.getTypeCategory(),
+                            attributeInfo.isComposite, true);
+                    if (!deleted) {
+                        additionalElements.add(edgeIdForDelete);
+                    }
                 }
             }
         }

@@ -454,11 +456,13 @@ public final class TypedInstanceToGraphMapper {
             if (!newMap.values().contains(currentEdge)) {
                 String edgeLabel = GraphHelper.getQualifiedNameForMapKey(propertyName, currentKey);
                 Edge edge = graphHelper.getEdgeByEdgeId(instanceVertex, edgeLabel, currentMap.get(currentKey));
-                boolean deleted = deleteHandler.deleteEdgeReference(edge, elementType.getTypeCategory(),
-                        attributeInfo.isComposite, true);
-                if (!deleted) {
-                    additionalMap.put(currentKey, currentEdge);
-                    shouldDeleteKey = false;
+                if (edge != null) {
+                    boolean deleted = deleteHandler.deleteEdgeReference(edge, elementType.getTypeCategory(),
+                            attributeInfo.isComposite, true);
+                    if (!deleted) {
+                        additionalMap.put(currentKey, currentEdge);
+                        shouldDeleteKey = false;
+                    }
                 }
             }
         }

@@ -702,7 +706,9 @@ public final class TypedInstanceToGraphMapper {
         } else if (attributeInfo.dataType() == DataTypes.DATE_TYPE) {
             final Date dateVal = typedInstance.getDate(attributeInfo.name);
             //Convert Property value to Long while persisting
-            propertyValue = dateVal.getTime();
+            if (dateVal != null) {
+                propertyValue = dateVal.getTime();
+            }
         } else if (attributeInfo.dataType().getTypeCategory() == DataTypes.TypeCategory.ENUM) {
             if (attrValue != null) {
                 propertyValue = ((EnumValue) attrValue).value;
repository/src/main/java/org/apache/atlas/repository/typestore/GraphBackedTypeStore.java

@@ -168,16 +168,20 @@ public class GraphBackedTypeStore implements ITypeStore {
         switch (attrDataType.getTypeCategory()) {
         case ARRAY:
             String attrType = TypeUtils.parseAsArrayType(attrDataType.getName());
-            IDataType elementType = typeSystem.getDataType(IDataType.class, attrType);
-            attrDataTypes.add(elementType);
+            if (attrType != null) {
+                IDataType elementType = typeSystem.getDataType(IDataType.class, attrType);
+                attrDataTypes.add(elementType);
+            }
             break;

         case MAP:
             String[] attrTypes = TypeUtils.parseAsMapType(attrDataType.getName());
-            IDataType keyType = typeSystem.getDataType(IDataType.class, attrTypes[0]);
-            IDataType valueType = typeSystem.getDataType(IDataType.class, attrTypes[1]);
-            attrDataTypes.add(keyType);
-            attrDataTypes.add(valueType);
+            if (attrTypes != null && attrTypes.length > 1) {
+                IDataType keyType = typeSystem.getDataType(IDataType.class, attrTypes[0]);
+                IDataType valueType = typeSystem.getDataType(IDataType.class, attrTypes[1]);
+                attrDataTypes.add(keyType);
+                attrDataTypes.add(valueType);
+            }
             break;

         case ENUM:
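The MAP case now checks both that the parse helper returned non-null and that the array is long enough before indexing `attrTypes[0]` and `attrTypes[1]`. A self-contained sketch of the same guard, with a toy `parseAsMapType` standing in for the Atlas `TypeUtils` helper:

```java
public class MapTypeParseSketch {

    // Toy parser: null for anything that is not "map<...>", otherwise the
    // comma-separated inner parts. Stand-in for TypeUtils.parseAsMapType.
    static String[] parseAsMapType(String name) {
        if (!name.startsWith("map<") || !name.endsWith(">")) {
            return null;
        }
        return name.substring(4, name.length() - 1).split(",");
    }

    // Mirrors the MAP case: register key and value types only when the parse
    // produced at least two components (the Coverity guard).
    static int registeredTypes(String typeName) {
        String[] attrTypes = parseAsMapType(typeName);
        int added = 0;
        if (attrTypes != null && attrTypes.length > 1) {
            added += 2;   // key type + value type
        }
        return added;
    }
}
```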
repository/src/main/java/org/apache/atlas/services/DefaultMetadataService.java

@@ -239,7 +239,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
     }

     private JSONObject createOrUpdateTypes(String typeDefinition, boolean isUpdate) throws AtlasException {
-        ParamChecker.notEmpty(typeDefinition, "type definition");
+        typeDefinition = ParamChecker.notEmpty(typeDefinition, "type definition");
         TypesDef typesDef = validateTypeDefinition(typeDefinition);

         try {

@@ -327,7 +327,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
      */
     @Override
     public List<String> createEntities(String entityInstanceDefinition) throws AtlasException {
-        ParamChecker.notEmpty(entityInstanceDefinition, "Entity instance definition");
+        entityInstanceDefinition = ParamChecker.notEmpty(entityInstanceDefinition, "Entity instance definition");
         ITypedReferenceableInstance[] typedInstances = deserializeClassInstances(entityInstanceDefinition);

@@ -362,8 +362,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
     @Override
     public ITypedReferenceableInstance getTypedReferenceableInstance(Referenceable entityInstance) throws AtlasException {
-        final String entityTypeName = entityInstance.getTypeName();
-        ParamChecker.notEmpty(entityTypeName, "Entity type cannot be null");
+        final String entityTypeName = ParamChecker.notEmpty(entityInstance.getTypeName(), "Entity type cannot be null");

         ClassType entityType = typeSystem.getDataType(ClassType.class, entityTypeName);

@@ -385,7 +384,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
      */
     @Override
     public String getEntityDefinition(String guid) throws AtlasException {
-        ParamChecker.notEmpty(guid, "entity id");
+        guid = ParamChecker.notEmpty(guid, "entity id");

         final ITypedReferenceableInstance instance = repository.getEntityDefinition(guid);
         return InstanceSerialization.toJson(instance, true);

@@ -440,8 +439,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
      */
     @Override
     public AtlasClient.EntityResult updateEntities(String entityInstanceDefinition) throws AtlasException {
-        ParamChecker.notEmpty(entityInstanceDefinition, "Entity instance definition");
+        entityInstanceDefinition = ParamChecker.notEmpty(entityInstanceDefinition, "Entity instance definition");

         ITypedReferenceableInstance[] typedInstances = deserializeClassInstances(entityInstanceDefinition);
         AtlasClient.EntityResult entityResult = repository.updateEntities(typedInstances);

@@ -457,11 +455,11 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
     }

     @Override
-    public AtlasClient.EntityResult updateEntityAttributeByGuid(final String guid, String attributeName,
+    public AtlasClient.EntityResult updateEntityAttributeByGuid(String guid, String attributeName,
                                                                 String value) throws AtlasException {
-        ParamChecker.notEmpty(guid, "entity id");
-        ParamChecker.notEmpty(attributeName, "attribute name");
-        ParamChecker.notEmpty(value, "attribute value");
+        guid = ParamChecker.notEmpty(guid, "entity id");
+        attributeName = ParamChecker.notEmpty(attributeName, "attribute name");
+        value = ParamChecker.notEmpty(value, "attribute value");

         ITypedReferenceableInstance existInstance = validateEntityExists(guid);
         ClassType type = typeSystem.getDataType(ClassType.class, existInstance.getTypeName());

@@ -502,10 +500,10 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
     }

     @Override
-    public AtlasClient.EntityResult updateEntityPartialByGuid(final String guid, Referenceable newEntity)
+    public AtlasClient.EntityResult updateEntityPartialByGuid(String guid, Referenceable newEntity)
             throws AtlasException {
-        ParamChecker.notEmpty(guid, "guid cannot be null");
);
ParamChecker
.
notNull
(
newEntity
,
"updatedEntity cannot be null"
);
guid
=
ParamChecker
.
notEmpty
(
guid
,
"guid cannot be null"
);
newEntity
=
ParamChecker
.
notNull
(
newEntity
,
"updatedEntity cannot be null"
);
ITypedReferenceableInstance
existInstance
=
validateEntityExists
(
guid
);
ITypedReferenceableInstance
newInstance
=
convertToTypedInstance
(
newEntity
,
existInstance
.
getTypeName
());
...
...
@@ -563,10 +561,10 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
public
AtlasClient
.
EntityResult
updateEntityByUniqueAttribute
(
String
typeName
,
String
uniqueAttributeName
,
String
attrValue
,
Referenceable
updatedEntity
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
typeName
,
"typeName"
);
ParamChecker
.
notEmpty
(
uniqueAttributeName
,
"uniqueAttributeName"
);
ParamChecker
.
notNull
(
attrValue
,
"unique attribute value"
);
ParamChecker
.
notNull
(
updatedEntity
,
"updatedEntity"
);
typeName
=
ParamChecker
.
notEmpty
(
typeName
,
"typeName"
);
uniqueAttributeName
=
ParamChecker
.
notEmpty
(
uniqueAttributeName
,
"uniqueAttributeName"
);
attrValue
=
ParamChecker
.
notNull
(
attrValue
,
"unique attribute value"
);
updatedEntity
=
ParamChecker
.
notNull
(
updatedEntity
,
"updatedEntity"
);
ITypedReferenceableInstance
oldInstance
=
getEntityDefinitionReference
(
typeName
,
uniqueAttributeName
,
attrValue
);
...
...
@@ -579,7 +577,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
}
private
void
validateTypeExists
(
String
entityType
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
entityType
,
"entity type"
);
entityType
=
ParamChecker
.
notEmpty
(
entityType
,
"entity type"
);
IDataType
type
=
typeSystem
.
getDataType
(
IDataType
.
class
,
entityType
);
if
(
type
.
getTypeCategory
()
!=
DataTypes
.
TypeCategory
.
CLASS
)
{
...
...
@@ -596,7 +594,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
*/
@Override
public
List
<
String
>
getTraitNames
(
String
guid
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
guid
=
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
return
repository
.
getTraitNames
(
guid
);
}
...
...
@@ -609,8 +607,8 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
*/
@Override
public
void
addTrait
(
String
guid
,
String
traitInstanceDefinition
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
ParamChecker
.
notEmpty
(
traitInstanceDefinition
,
"trait instance definition"
);
guid
=
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
traitInstanceDefinition
=
ParamChecker
.
notEmpty
(
traitInstanceDefinition
,
"trait instance definition"
);
ITypedStruct
traitInstance
=
deserializeTraitInstance
(
traitInstanceDefinition
);
addTrait
(
guid
,
traitInstance
);
...
...
@@ -644,8 +642,7 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
@Override
public
ITypedStruct
createTraitInstance
(
Struct
traitInstance
)
throws
AtlasException
{
try
{
final
String
entityTypeName
=
traitInstance
.
getTypeName
();
ParamChecker
.
notEmpty
(
entityTypeName
,
"entity type"
);
final
String
entityTypeName
=
ParamChecker
.
notEmpty
(
traitInstance
.
getTypeName
(),
"entity type"
);
TraitType
traitType
=
typeSystem
.
getDataType
(
TraitType
.
class
,
entityTypeName
);
return
traitType
.
convert
(
traitInstance
,
Multiplicity
.
REQUIRED
);
...
...
@@ -657,8 +654,8 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
}
@Override
public
String
getTraitDefinition
(
final
String
guid
,
final
String
traitName
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
public
String
getTraitDefinition
(
String
guid
,
final
String
traitName
)
throws
AtlasException
{
guid
=
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
final
ITypedReferenceableInstance
instance
=
repository
.
getEntityDefinition
(
guid
);
IStruct
struct
=
instance
.
getTrait
(
traitName
);
...
...
@@ -674,8 +671,8 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
*/
@Override
public
void
deleteTrait
(
String
guid
,
String
traitNameToBeDeleted
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
ParamChecker
.
notEmpty
(
traitNameToBeDeleted
,
"trait name"
);
guid
=
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
traitNameToBeDeleted
=
ParamChecker
.
notEmpty
(
traitNameToBeDeleted
,
"trait name"
);
// ensure trait type is already registered with the TS
if
(!
typeSystem
.
isRegistered
(
traitNameToBeDeleted
))
{
...
...
@@ -747,8 +744,8 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
@Override
public
List
<
EntityAuditEvent
>
getAuditEvents
(
String
guid
,
String
startKey
,
short
count
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
ParamChecker
.
notEmptyIfNotNull
(
startKey
,
"start key"
);
guid
=
ParamChecker
.
notEmpty
(
guid
,
"entity id"
);
startKey
=
ParamChecker
.
notEmptyIfNotNull
(
startKey
,
"start key"
);
ParamChecker
.
lessThan
(
count
,
maxAuditResults
,
"count"
);
return
auditRepository
.
listEvents
(
guid
,
startKey
,
count
);
...
...
@@ -766,9 +763,9 @@ public class DefaultMetadataService implements MetadataService, ActiveStateChang
@Override
public
AtlasClient
.
EntityResult
deleteEntityByUniqueAttribute
(
String
typeName
,
String
uniqueAttributeName
,
String
attrValue
)
throws
AtlasException
{
ParamChecker
.
notEmpty
(
typeName
,
"delete candidate typeName"
);
ParamChecker
.
notEmpty
(
uniqueAttributeName
,
"delete candidate unique attribute name"
);
ParamChecker
.
notEmpty
(
attrValue
,
"delete candidate unique attribute value"
);
typeName
=
ParamChecker
.
notEmpty
(
typeName
,
"delete candidate typeName"
);
uniqueAttributeName
=
ParamChecker
.
notEmpty
(
uniqueAttributeName
,
"delete candidate unique attribute name"
);
attrValue
=
ParamChecker
.
notEmpty
(
attrValue
,
"delete candidate unique attribute value"
);
//Throws EntityNotFoundException if the entity could not be found by its unique attribute
ITypedReferenceableInstance
instance
=
getEntityDefinitionReference
(
typeName
,
uniqueAttributeName
,
attrValue
);
...
...
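The recurring change in this file — `ParamChecker.notEmpty(x, msg)` becoming `x = ParamChecker.notEmpty(x, msg)` — pairs with the `ParamChecker` change elsewhere in this commit: the check methods now return the validated value, so the caller can capture a normalized copy and Coverity no longer flags an ignored result. A minimal sketch of such a returning checker; this simplified version (and its trimming behavior) is illustrative, not necessarily what Atlas's `ParamChecker` does:

```java
public final class ParamCheckerSketch {
    // Simplified stand-in for ParamChecker.notEmpty: validate, normalize, return.
    // Callers then write: value = notEmpty(value, "name");
    public static String notEmpty(String value, String name) {
        if (value == null || value.trim().isEmpty()) {
            throw new IllegalArgumentException(name + " cannot be empty");
        }
        return value.trim(); // returning lets the caller keep the normalized value
    }
}
```

With a void checker the caller keeps the raw value; with a returning checker, the assignment form used throughout this diff keeps the validated one.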
titan/src/main/java/com/thinkaurelius/titan/diskstorage/hbase/HBaseKeyColumnValueStore.java

@@ -330,18 +330,19 @@ public class HBaseKeyColumnValueStore implements KeyColumnValueStore {
         ensureOpen();

         return new RecordIterator<Entry>() {
-            private final Iterator<Map.Entry<byte[], NavigableMap<Long, byte[]>>> kv =
-                    currentRow.getMap().get(columnFamilyBytes).entrySet().iterator();
+            private final NavigableMap<byte[], NavigableMap<byte[], NavigableMap<Long, byte[]>>> currentMap = currentRow.getMap();
+            private final Iterator<Map.Entry<byte[], NavigableMap<Long, byte[]>>> kv =
+                    currentMap == null ? null : currentMap.get(columnFamilyBytes).entrySet().iterator();

             @Override
             public boolean hasNext() {
                 ensureOpen();
-                return kv.hasNext();
+                return kv == null ? false : kv.hasNext();
             }

             @Override
             public Entry next() {
                 ensureOpen();
-                return StaticArrayEntry.ofBytes(kv.next(), entryGetter);
+                return kv == null ? null : StaticArrayEntry.ofBytes(kv.next(), entryGetter);
             }

             @Override
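The fix above captures the result of `currentRow.getMap()` once and threads the possible null through the anonymous `RecordIterator`: the iterator field, `hasNext()`, and `next()` all short-circuit when the map was null. The guard pattern in isolation, with `String` keys standing in for the HBase `byte[]` types:

```java
import java.util.Iterator;
import java.util.Map;
import java.util.NavigableMap;

public final class NullSafeIteration {
    // Derive the iterator only if the source map exists.
    public static Iterator<Map.Entry<String, String>> entries(NavigableMap<String, String> row) {
        return row == null ? null : row.entrySet().iterator();
    }

    // Mirrors the guarded hasNext(): a null iterator simply reports exhaustion.
    public static boolean safeHasNext(Iterator<?> it) {
        return it == null ? false : it.hasNext();
    }
}
```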
typesystem/src/main/java/org/apache/atlas/typesystem/persistence/Id.java

@@ -45,9 +45,9 @@ public class Id implements ITypedReferenceableInstance {
     public EntityState state;

     public Id(String id, int version, String typeName, String state) {
-        ParamChecker.notEmpty(id, "id");
-        ParamChecker.notEmpty(typeName, "typeName");
-        ParamChecker.notEmptyIfNotNull(state, "state");
+        id       = ParamChecker.notEmpty(id, "id");
+        typeName = ParamChecker.notEmpty(typeName, "typeName");
+        state    = ParamChecker.notEmptyIfNotNull(state, "state");
         this.id = id;
         this.typeName = typeName;
         this.version = version;
typesystem/src/main/java/org/apache/atlas/typesystem/types/AbstractDataType.java

@@ -47,12 +47,18 @@ abstract class AbstractDataType<T> implements IDataType<T> {
     @Override
     public void output(T val, Appendable buf, String prefix, Set<T> inProcess) throws AtlasException {
-        if (val instanceof Map) {
+        final String strValue;
+
+        if (val == null) {
+            strValue = "<null>";
+        } else if (val instanceof Map) {
             ImmutableSortedMap immutableSortedMap = ImmutableSortedMap.copyOf((Map) val);
-            TypeUtils.outputVal(val == null ? "<null>" : immutableSortedMap.toString(), buf, prefix);
+            strValue = immutableSortedMap.toString();
         } else {
-            TypeUtils.outputVal(val == null ? "<null>" : val.toString(), buf, prefix);
+            strValue = val.toString();
         }
+
+        TypeUtils.outputVal(strValue, buf, prefix);
     }

     @Override
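The rewritten `output()` hoists the null test ahead of the `instanceof` branch. In Java, `null instanceof Map` is always false, so with the old ordering a null `val` fell into the `else` branch and the `val == null` ternary inside the `Map` branch was dead code — exactly the kind of logically-dead condition Coverity reports. The fixed ordering in miniature, using `TreeMap` for sorted output where the original uses Guava's `ImmutableSortedMap.copyOf`:

```java
import java.util.Map;
import java.util.TreeMap;

public final class NullBeforeInstanceof {
    public static String render(Object val) {
        final String strValue;
        if (val == null) {
            strValue = "<null>";          // handled first; later branches never see null
        } else if (val instanceof Map) {
            // sorted view of the map, like ImmutableSortedMap.copyOf
            strValue = new TreeMap<Object, Object>((Map<?, ?>) val).toString();
        } else {
            strValue = val.toString();    // safe: val is known non-null here
        }
        return strValue;
    }
}
```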
typesystem/src/main/java/org/apache/atlas/typesystem/types/HierarchicalType.java

@@ -517,17 +517,20 @@ public abstract class HierarchicalType<ST extends HierarchicalType, T> extends A
         @Override
         public Path next() {
             Path p = pathQueue.poll();
-            ST t = null;
-            try {
-                t = (ST) typeSystem.getDataType(superTypeClass, p.typeName);
-            } catch (AtlasException me) {
-                throw new RuntimeException(me);
-            }
-            if (t.superTypes != null) {
-                ImmutableSet<String> sTs = t.superTypes;
-                for (String sT : sTs) {
-                    String nm = sT + "." + p.pathName;
-                    pathQueue.add(pathNameToPathMap.get(nm));
+
+            if (p != null) {
+                ST t = null;
+                try {
+                    t = (ST) typeSystem.getDataType(superTypeClass, p.typeName);
+                } catch (AtlasException me) {
+                    throw new RuntimeException(me);
+                }
+                if (t.superTypes != null) {
+                    ImmutableSet<String> sTs = t.superTypes;
+                    for (String sT : sTs) {
+                        String nm = sT + "." + p.pathName;
+                        pathQueue.add(pathNameToPathMap.get(nm));
+                    }
                 }
             }
             return p;
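`Queue.poll()` returns null when the queue is empty (unlike `remove()`, which throws `NoSuchElementException`), so the unguarded `p.typeName` dereference is a potential NPE that Coverity flags; the fix wraps the body in `if (p != null)`. The same `poll()` guard is applied in `ObjectGraphTraversal` and `ObjectGraphWalker` below. The contract in isolation:

```java
import java.util.Queue;

public final class PollGuard {
    // poll() yields null on an empty queue; dereferencing that result would NPE.
    public static int headLength(Queue<String> q) {
        String head = q.poll();
        if (head != null) {   // the guard the Coverity fix adds
            return head.length();
        }
        return -1;            // sentinel: queue was empty
    }
}
```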
typesystem/src/main/java/org/apache/atlas/typesystem/types/ObjectGraphTraversal.java

@@ -172,7 +172,9 @@ public class ObjectGraphTraversal implements Iterator<ObjectGraphTraversal.Insta
     public InstanceTuple next() {
         try {
             InstanceTuple t = queue.poll();
-            processReferenceableInstance(t.instance);
+            if (t != null) {
+                processReferenceableInstance(t.instance);
+            }
             return t;
         } catch (AtlasException me) {
             throw new RuntimeException(me);
typesystem/src/main/java/org/apache/atlas/typesystem/types/ObjectGraphWalker.java

@@ -76,7 +76,9 @@ public class ObjectGraphWalker {
     public void walk() throws AtlasException {
         while (!queue.isEmpty()) {
             IReferenceableInstance r = queue.poll();
-            processReferenceableInstance(r);
+            if (r != null) {
+                processReferenceableInstance(r);
+            }
         }
     }
webapp/src/main/java/org/apache/atlas/Atlas.java

@@ -147,9 +147,12 @@ public final class Atlas {
     }

     static int getApplicationPort(CommandLine cmd, String enableTLSFlag, Configuration configuration) {
+        String optionValue = cmd.hasOption(APP_PORT) ? cmd.getOptionValue(APP_PORT) : null;
         final int appPort;
-        if (cmd.hasOption(APP_PORT)) {
-            appPort = Integer.valueOf(cmd.getOptionValue(APP_PORT));
+
+        if (StringUtils.isNotEmpty(optionValue)) {
+            appPort = Integer.valueOf(optionValue);
         } else {
             // default : atlas.enableTLS is true
             appPort = getPortValue(configuration, enableTLSFlag);
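Reading the option into `optionValue` first and guarding the parse with `StringUtils.isNotEmpty` covers the case where `-port` is present but carries a null or empty value, which previously went straight into `Integer.valueOf` and threw. The selection logic, with a plain JDK check standing in for Commons Lang's `StringUtils.isNotEmpty`, and `defaultPort` standing in for what `getPortValue(configuration, enableTLSFlag)` would return:

```java
public final class PortSelection {
    public static int applicationPort(String optionValue, int defaultPort) {
        // StringUtils.isNotEmpty equivalent: non-null and non-empty
        if (optionValue != null && !optionValue.isEmpty()) {
            return Integer.valueOf(optionValue);
        }
        return defaultPort; // default: derived from the atlas.enableTLS configuration
    }
}
```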
webapp/src/main/java/org/apache/atlas/examples/QuickStart.java

@@ -466,7 +466,7 @@ public class QuickStart {
             if (results != null) {
                 System.out.println("query [" + dslQuery + "] returned [" + results.length() + "] rows");
             } else {
-                System.out.println("query [" + dslQuery + "] failed, results:" + results.toString());
+                System.out.println("query [" + dslQuery + "] failed, results:" + results);
             }
         }
     }
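In that `else` branch `results` is known to be null, so `results.toString()` is a guaranteed NPE. String concatenation with `+`, by contrast, goes through `String.valueOf`, which renders a null reference as the literal `"null"` instead of throwing:

```java
public final class NullConcat {
    public static String failureMessage(String dslQuery, Object results) {
        // '+' uses String.valueOf(results): null becomes "null" rather than an NPE
        return "query [" + dslQuery + "] failed, results:" + results;
    }
}
```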
webapp/src/main/java/org/apache/atlas/util/CredentialProviderUtility.java

@@ -74,26 +74,28 @@ public class CredentialProviderUtility {
             // prompt for the provider name
             CredentialProvider provider = getCredentialProvider(textDevice);

-            char[] cred;
-            for (String key : KEYS) {
-                cred = getPassword(textDevice, key);
-                // create a credential entry and store it
-                boolean overwrite = true;
-                if (provider.getCredentialEntry(key) != null) {
-                    String choice = textDevice.readLine("Entry for %s already exists. Overwrite? (y/n) [y]:", key);
-                    overwrite = StringUtils.isEmpty(choice) || choice.equalsIgnoreCase("y");
-                    if (overwrite) {
-                        provider.deleteCredentialEntry(key);
-                        provider.flush();
-                        provider.createCredentialEntry(key, cred);
-                        provider.flush();
-                        textDevice.printf("Entry for %s was overwritten with the new value.\n", key);
-                    } else {
-                        textDevice.printf("Entry for %s was not overwritten.\n", key);
-                    }
-                } else {
-                    provider.createCredentialEntry(key, cred);
-                    provider.flush();
+            if (provider != null) {
+                char[] cred;
+                for (String key : KEYS) {
+                    cred = getPassword(textDevice, key);
+                    // create a credential entry and store it
+                    boolean overwrite = true;
+                    if (provider.getCredentialEntry(key) != null) {
+                        String choice = textDevice.readLine("Entry for %s already exists. Overwrite? (y/n) [y]:", key);
+                        overwrite = StringUtils.isEmpty(choice) || choice.equalsIgnoreCase("y");
+                        if (overwrite) {
+                            provider.deleteCredentialEntry(key);
+                            provider.flush();
+                            provider.createCredentialEntry(key, cred);
+                            provider.flush();
+                            textDevice.printf("Entry for %s was overwritten with the new value.\n", key);
+                        } else {
+                            textDevice.printf("Entry for %s was not overwritten.\n", key);
+                        }
+                    } else {
+                        provider.createCredentialEntry(key, cred);
+                        provider.flush();
+                    }
                 }
             }

@@ -141,16 +143,21 @@ public class CredentialProviderUtility {
     */
    private static CredentialProvider getCredentialProvider(TextDevice textDevice) throws IOException {
        String providerPath = textDevice.readLine("Please enter the full path to the credential provider:");

-        File file = new File(providerPath);
-        if (file.exists()) {
-            textDevice.printf("%s already exists. You will need to specify whether existing entries should be "
-                    + "overwritten " + "(default is 'yes')\n", providerPath);
+        if (providerPath != null) {
+            File file = new File(providerPath);
+            if (file.exists()) {
+                textDevice.printf("%s already exists. You will need to specify whether existing entries should be "
+                        + "overwritten " + "(default is 'yes')\n", providerPath);
+            }
+
+            String providerURI = JavaKeyStoreProvider.SCHEME_NAME + "://file/" + providerPath;
+            Configuration conf = new Configuration(false);
+            conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, providerURI);
+            return CredentialProviderFactory.getProviders(conf).get(0);
        }
-        String providerURI = JavaKeyStoreProvider.SCHEME_NAME + "://file/" + providerPath;
-        Configuration conf = new Configuration(false);
-        conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, providerURI);
-        return CredentialProviderFactory.getProviders(conf).get(0);
+
+        return null;
    }
 }
webapp/src/main/java/org/apache/atlas/web/dao/UserDao.java

@@ -19,6 +19,7 @@ package org.apache.atlas.web.dao;
 import com.google.common.annotations.VisibleForTesting;
 import java.io.FileInputStream;
+import java.io.InputStream;
 import java.io.IOException;
 import java.util.ArrayList;
 import java.util.Properties;
@@ -54,6 +55,8 @@ public class UserDao {
     void loadFileLoginsDetails() {
         String PROPERTY_FILE_PATH = null;
+        InputStream inStr = null;
+
         try {
             Configuration configuration = ApplicationProperties.get();
@@ -61,7 +64,8 @@ public class UserDao {
                     .getString("atlas.authentication.method.file.filename");
             if (PROPERTY_FILE_PATH != null && !"".equals(PROPERTY_FILE_PATH)) {
                 userLogins = new Properties();
-                userLogins.load(new FileInputStream(PROPERTY_FILE_PATH));
+                inStr = new FileInputStream(PROPERTY_FILE_PATH);
+                userLogins.load(inStr);
             } else {
                 LOG.error("Error while reading user.properties file, filepath=" + PROPERTY_FILE_PATH);
@@ -70,6 +74,14 @@ public class UserDao {
         } catch (IOException | AtlasException e) {
             LOG.error("Error while reading user.properties file, filepath=" + PROPERTY_FILE_PATH, e);
+        } finally {
+            if (inStr != null) {
+                try {
+                    inStr.close();
+                } catch (Exception excp) {
+                    // ignore
+                }
+            }
         }
     }
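The UserDao change is the classic pre-Java-7 leak fix: keep the `InputStream` in a local, close it in `finally`, and swallow close errors. On Java 7+, the same resource handling can be written with try-with-resources, which closes the stream automatically even when `load` throws. A sketch (class and method names here are illustrative, not Atlas code):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class LoginProperties {
    public static Properties load(String propertyFilePath) throws IOException {
        Properties userLogins = new Properties();
        // try-with-resources closes inStr automatically, replacing the explicit finally block
        try (InputStream inStr = new FileInputStream(propertyFilePath)) {
            userLogins.load(inStr);
        }
        return userLogins;
    }
}
```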
webapp/src/main/java/org/apache/atlas/web/resources/EntityResource.java

@@ -339,7 +339,7 @@ public class EntityResource {
     private Response updateEntityPartialByGuid(String guid, HttpServletRequest request) {
         String entityJson = null;
         try {
-            ParamChecker.notEmpty(guid, "Guid property cannot be null");
+            guid = ParamChecker.notEmpty(guid, "Guid property cannot be null");
             entityJson = Servlets.getRequestPayload(request);
             LOG.info("partially updating entity for guid {} : {} ", guid, entityJson);
@@ -468,7 +468,7 @@ public class EntityResource {
             }
             LOG.debug("Fetching entity definition for guid={} ", guid);
-            ParamChecker.notEmpty(guid, "guid cannot be null");
+            guid = ParamChecker.notEmpty(guid, "guid cannot be null");
             final String entityDefinition = metadataService.getEntityDefinition(guid);

             JSONObject response = new JSONObject();
@@ -564,9 +564,9 @@ public class EntityResource {
     public Response getEntityDefinitionByAttribute(String entityType, String attribute, String value) {
         try {
             LOG.debug("Fetching entity definition for type={}, qualified name={}", entityType, value);
-            ParamChecker.notEmpty(entityType, "Entity type cannot be null");
-            ParamChecker.notEmpty(attribute, "attribute name cannot be null");
-            ParamChecker.notEmpty(value, "attribute value cannot be null");
+            entityType = ParamChecker.notEmpty(entityType, "Entity type cannot be null");
+            attribute  = ParamChecker.notEmpty(attribute, "attribute name cannot be null");
+            value      = ParamChecker.notEmpty(value, "attribute value cannot be null");

             final String entityDefinition = metadataService.getEntityDefinition(entityType, attribute, value);
webapp/src/main/java/org/apache/atlas/web/resources/MetadataDiscoveryResource.java

@@ -91,7 +91,7 @@ public class MetadataDiscoveryResource {
             @DefaultValue(LIMIT_OFFSET_DEFAULT) @QueryParam("offset") int offset) {
         AtlasPerfTracer perf = null;
         if (AtlasPerfTracer.isPerfTraceEnabled(PERF_LOG)) {
-            perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.search(" + query + ")");
+            perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.search(" + query + ", " + limit + ", " + offset + ")");
         }

         Response response = searchUsingQueryDSL(query, limit, offset);
         if (response.getStatus() != Response.Status.OK.getStatusCode()) {
@@ -123,10 +123,10 @@ public class MetadataDiscoveryResource {
         AtlasPerfTracer perf = null;
         try {
             if (AtlasPerfTracer.isPerfTraceEnabled(PERF_LOG)) {
-                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingQueryDSL(" + dslQuery + ")");
+                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingQueryDSL(" + dslQuery + ", " + limit + ", " + offset + ")");
             }

-            ParamChecker.notEmpty(dslQuery, "dslQuery cannot be null");
+            dslQuery = ParamChecker.notEmpty(dslQuery, "dslQuery cannot be null");
             QueryParams queryParams = validateQueryParams(limit, offset);
             final String jsonResultStr = discoveryService.searchByDSL(dslQuery, queryParams);
@@ -184,7 +184,7 @@ public class MetadataDiscoveryResource {
                 perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingGremlinQuery(" + gremlinQuery + ")");
             }

-            ParamChecker.notEmpty(gremlinQuery, "gremlinQuery cannot be null or empty");
+            gremlinQuery = ParamChecker.notEmpty(gremlinQuery, "gremlinQuery cannot be null or empty");
             final List<Map<String, String>> results = discoveryService.searchByGremlin(gremlinQuery);

             JSONObject response = new JSONObject();
@@ -230,10 +230,10 @@ public class MetadataDiscoveryResource {
         AtlasPerfTracer perf = null;
         try {
             if (AtlasPerfTracer.isPerfTraceEnabled(PERF_LOG)) {
-                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingFullText(" + query + ")");
+                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingFullText(" + query + ", " + limit + ", " + offset + ")");
             }

-            ParamChecker.notEmpty(query, "query cannot be null or empty");
+            query = ParamChecker.notEmpty(query, "query cannot be null or empty");
             QueryParams queryParams = validateQueryParams(limit, offset);
             final String jsonResultStr = discoveryService.searchByFullText(query, queryParams);
             JSONArray rowsJsonArr = new JSONArray(jsonResultStr);
webapp/src/main/java/org/apache/atlas/web/security/AtlasAuthenticationProvider.java

@@ -69,35 +69,32 @@ public class AtlasAuthenticationProvider extends
         if (ldapType.equalsIgnoreCase("LDAP")) {
             try {
-                authentication = ldapAuthenticationProvider
-                        .authenticate(authentication);
+                authentication = ldapAuthenticationProvider.authenticate(authentication);
             } catch (Exception ex) {
                 LOG.error("Error while LDAP authentication", ex);
             }
         } else if (ldapType.equalsIgnoreCase("AD")) {
             try {
-                authentication = adAuthenticationProvider
-                        .authenticate(authentication);
+                authentication = adAuthenticationProvider.authenticate(authentication);
             } catch (Exception ex) {
                 LOG.error("Error while AD authentication", ex);
             }
         }

-        if (authentication != null && authentication.isAuthenticated()) {
-            return authentication;
-        } else {
-            // If the LDAP/AD authentication fails try the local filebased login method
-            if (fileAuthenticationMethodEnabled) {
-                authentication = fileAuthenticationProvider.authenticate(authentication);
-            }
-            if (authentication != null && authentication.isAuthenticated()) {
+        if (authentication != null) {
+            if (authentication.isAuthenticated()) {
                 return authentication;
-            } else {
-                LOG.error("Authentication failed.");
-                throw new AtlasAuthenticationException("Authentication failed.");
+            } else if (fileAuthenticationMethodEnabled) {
+                // If the LDAP/AD authentication fails try the local filebased login method
+                authentication = fileAuthenticationProvider.authenticate(authentication);
+
+                if (authentication != null && authentication.isAuthenticated()) {
+                    return authentication;
+                }
             }
         }
+
+        LOG.error("Authentication failed.");
+        throw new AtlasAuthenticationException("Authentication failed.");
     }
 }
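The restructured flow null-checks the LDAP/AD result once: an authenticated result returns immediately, a non-null but unauthenticated result falls through to the file-based provider (when that fallback is enabled), and every other path reaches the single shared failure at the bottom. The decision chain condensed, with a tiny stand-in for Spring Security's `Authentication` object:

```java
public final class AuthFallback {
    // Minimal stand-in for an Authentication result: null = provider returned nothing.
    public static final class Auth {
        public final boolean authenticated;
        public Auth(boolean authenticated) { this.authenticated = authenticated; }
    }

    // Mirrors the fixed flow: success returns; a definite failure tries the
    // file-based fallback only when it is enabled; everything else throws.
    public static Auth resolve(Auth primary, boolean fileFallbackEnabled, Auth fileResult) {
        if (primary != null) {
            if (primary.authenticated) {
                return primary;                 // LDAP/AD succeeded
            } else if (fileFallbackEnabled) {
                // primary failed: try the local file-based login
                if (fileResult != null && fileResult.authenticated) {
                    return fileResult;
                }
            }
        }
        throw new IllegalStateException("Authentication failed.");
    }
}
```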