dataplatform / atlas - Commit e15629c2

authored Jul 19, 2016 by Madhan Neethiraj
committed by Suma Shivaprasad, Jul 20, 2016

ATLAS-1033: fix for issues flagged by Coverity scan

parent b7f5995a

Showing 30 changed files with 326 additions and 254 deletions (+326, -254)
FalconBridge.java                    +20 -11   ...ain/java/org/apache/atlas/falcon/bridge/FalconBridge.java
HiveHook.java                        +47 -33   ...ge/src/main/java/org/apache/atlas/hive/hook/HiveHook.java
PolicyParser.java                    +2  -11   .../java/org/apache/atlas/authorize/simple/PolicyParser.java
SimpleAtlasAuthorizer.java           +7  -4    .../apache/atlas/authorize/simple/SimpleAtlasAuthorizer.java
AtlasClient.java                     +14 -4    client/src/main/java/org/apache/atlas/AtlasClient.java
AtlasServiceException.java           +2  -2    ...src/main/java/org/apache/atlas/AtlasServiceException.java
ParamChecker.java                    +0  -14   ...on/src/main/java/org/apache/atlas/utils/ParamChecker.java
FailedMessagesLogger.java            +9  -7    ...main/java/org/apache/atlas/hook/FailedMessagesLogger.java
release-log.txt                      +1  -0    release-log.txt
DataSetLineageService.java           +6  -6    ...ava/org/apache/atlas/discovery/DataSetLineageService.java
GraphBackedDiscoveryService.java     +5  -0    ...he/atlas/discovery/graph/GraphBackedDiscoveryService.java
DeleteHandler.java                   +22 -21   ...java/org/apache/atlas/repository/graph/DeleteHandler.java
GraphBackedMetadataRepository.java   +6  -4    ...atlas/repository/graph/GraphBackedMetadataRepository.java
GraphHelper.java                     +16 -8    ...n/java/org/apache/atlas/repository/graph/GraphHelper.java
TypedInstanceToGraphMapper.java      +16 -10   ...he/atlas/repository/graph/TypedInstanceToGraphMapper.java
GraphBackedTypeStore.java            +10 -6    ...ache/atlas/repository/typestore/GraphBackedTypeStore.java
DefaultMetadataService.java          +30 -33   ...ava/org/apache/atlas/services/DefaultMetadataService.java
HBaseKeyColumnValueStore.java        +4  -3    ...ius/titan/diskstorage/hbase/HBaseKeyColumnValueStore.java
Id.java                              +3  -3    ...main/java/org/apache/atlas/typesystem/persistence/Id.java
AbstractDataType.java                +9  -3    ...a/org/apache/atlas/typesystem/types/AbstractDataType.java
HierarchicalType.java                +14 -11   ...a/org/apache/atlas/typesystem/types/HierarchicalType.java
ObjectGraphTraversal.java            +3  -1    ...g/apache/atlas/typesystem/types/ObjectGraphTraversal.java
ObjectGraphWalker.java               +3  -1    .../org/apache/atlas/typesystem/types/ObjectGraphWalker.java
Atlas.java                           +5  -2    webapp/src/main/java/org/apache/atlas/Atlas.java
QuickStart.java                      +1  -1    ...p/src/main/java/org/apache/atlas/examples/QuickStart.java
CredentialProviderUtility.java       +34 -27   ...java/org/apache/atlas/util/CredentialProviderUtility.java
UserDao.java                         +13 -1    webapp/src/main/java/org/apache/atlas/web/dao/UserDao.java
EntityResource.java                  +5  -5    ...n/java/org/apache/atlas/web/resources/EntityResource.java
MetadataDiscoveryResource.java       +6  -6    ...apache/atlas/web/resources/MetadataDiscoveryResource.java
AtlasAuthenticationProvider.java     +13 -16   ...pache/atlas/web/security/AtlasAuthenticationProvider.java
addons/falcon-bridge/src/main/java/org/apache/atlas/falcon/bridge/FalconBridge.java

@@ -28,6 +28,7 @@ import org.apache.atlas.hive.bridge.HiveMetaStoreBridge;
 import org.apache.atlas.hive.model.HiveDataModelGenerator;
 import org.apache.atlas.hive.model.HiveDataTypes;
 import org.apache.atlas.typesystem.Referenceable;
+import org.apache.commons.collections.CollectionUtils;
 import org.apache.commons.lang3.StringUtils;
 import org.apache.falcon.entity.CatalogStorage;
 import org.apache.falcon.entity.FeedHelper;

@@ -284,18 +285,26 @@ public class FalconBridge {
                                                       Feed feed) throws Exception {
         org.apache.falcon.entity.v0.feed.Cluster feedCluster = FeedHelper.getCluster(feed, cluster.getName());
-        final CatalogTable table = getTable(feedCluster, feed);
-        if (table != null) {
-            CatalogStorage storage = new CatalogStorage(cluster, table);
-            return createHiveTableInstance(cluster.getName(), storage.getDatabase().toLowerCase(),
-                    storage.getTable().toLowerCase());
-        } else {
-            List<Location> locations = FeedHelper.getLocations(feedCluster, feed);
-            Location dataLocation = FileSystemStorage.getLocation(locations, LocationType.DATA);
-            final String pathUri = normalize(dataLocation.getPath());
-            LOG.info("Registering DFS Path {} ", pathUri);
-            return fillHDFSDataSet(pathUri, cluster.getName());
+        if (feedCluster != null) {
+            final CatalogTable table = getTable(feedCluster, feed);
+            if (table != null) {
+                CatalogStorage storage = new CatalogStorage(cluster, table);
+                return createHiveTableInstance(cluster.getName(), storage.getDatabase().toLowerCase(),
+                        storage.getTable().toLowerCase());
+            } else {
+                List<Location> locations = FeedHelper.getLocations(feedCluster, feed);
+                if (CollectionUtils.isNotEmpty(locations)) {
+                    Location dataLocation = FileSystemStorage.getLocation(locations, LocationType.DATA);
+                    if (dataLocation != null) {
+                        final String pathUri = normalize(dataLocation.getPath());
+                        LOG.info("Registering DFS Path {} ", pathUri);
+                        return fillHDFSDataSet(pathUri, cluster.getName());
+                    }
+                }
+            }
         }
+        return null;
     }

     private static CatalogTable getTable(org.apache.falcon.entity.v0.feed.Cluster cluster, Feed feed) {
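The FalconBridge hunk above shows the pattern this commit applies throughout: every nullable lookup result (`getCluster`, `getTable`, `getLocation`) is guarded before it is dereferenced, and the method falls through to a single `return null` instead of risking an NPE. A minimal, self-contained sketch of that guard style (the class and method names here are hypothetical stand-ins, not Atlas APIs):

```java
import java.util.List;

public class NullGuardSketch {
    // Stand-in for the FalconBridge lookup chain: each intermediate result
    // may be null/empty, so each is checked before the next dereference.
    static String resolvePath(List<String> locations) {
        if (locations == null || locations.isEmpty()) {  // CollectionUtils.isNotEmpty equivalent
            return null;                                 // nothing to register
        }
        String dataLocation = locations.get(0);          // stand-in for getLocation(...)
        if (dataLocation == null) {                      // the kind of guard Coverity's
            return null;                                 // FORWARD_NULL checker asks for
        }
        return dataLocation.toLowerCase();               // safe to dereference here
    }
}
```

Each branch that cannot produce a value returns `null` explicitly, so callers get one well-defined "not found" result instead of an exception from deep inside the lookup chain.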
addons/hive-bridge/src/main/java/org/apache/atlas/hive/hook/HiveHook.java

@@ -56,6 +56,7 @@ import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;

 import java.net.MalformedURLException;
+import java.net.URI;
 import java.util.ArrayList;
 import java.util.Comparator;
 import java.util.Date;

@@ -491,37 +492,42 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
             table = partition.getTable();
             db = dgiBridge.hiveClient.getDatabase(table.getDbName());
             break;
         default:
             LOG.info("{}: entity-type not handled by Atlas hook. Ignored", entity.getType());
         }

         db = dgiBridge.hiveClient.getDatabase(db.getName());
-        Referenceable dbEntity = dgiBridge.createDBInstance(db);
-        entities.add(dbEntity);
-        result.put(Type.DATABASE, dbEntity);
-        Referenceable tableEntity = null;
-        if (table != null) {
-            if (existTable != null) {
-                table = existTable;
-            } else {
-                table = dgiBridge.hiveClient.getTable(table.getDbName(), table.getTableName());
-            }
-            //If its an external table, even though the temp table skip flag is on,
-            // we create the table since we need the HDFS path to temp table lineage.
-            if (skipTempTables && table.isTemporary() && !TableType.EXTERNAL_TABLE.equals(table.getTableType())) {
-                LOG.debug("Skipping temporary table registration {} since it is not an external table {} ", table.getTableName(), table.getTableType().name());
-            } else {
-                tableEntity = dgiBridge.createTableInstance(dbEntity, table);
-                entities.add(tableEntity);
-                result.put(Type.TABLE, tableEntity);
-            }
-        }
-        event.addMessage(new HookNotification.EntityUpdateRequest(event.getUser(), entities));
+        if (db != null) {
+            Referenceable dbEntity = dgiBridge.createDBInstance(db);
+            entities.add(dbEntity);
+            result.put(Type.DATABASE, dbEntity);
+            Referenceable tableEntity = null;
+            if (table != null) {
+                if (existTable != null) {
+                    table = existTable;
+                } else {
+                    table = dgiBridge.hiveClient.getTable(table.getDbName(), table.getTableName());
+                }
+                //If its an external table, even though the temp table skip flag is on,
+                // we create the table since we need the HDFS path to temp table lineage.
+                if (skipTempTables && table.isTemporary() && !TableType.EXTERNAL_TABLE.equals(table.getTableType())) {
+                    LOG.debug("Skipping temporary table registration {} since it is not an external table {} ", table.getTableName(), table.getTableType().name());
+                } else {
+                    tableEntity = dgiBridge.createTableInstance(dbEntity, table);
+                    entities.add(tableEntity);
+                    result.put(Type.TABLE, tableEntity);
+                }
+            }
+            event.addMessage(new HookNotification.EntityUpdateRequest(event.getUser(), entities));
+        }
         return result;
     }

@@ -620,13 +626,16 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
                     entities.addAll(result.values());
                 }
             } else if (entity.getType() == Type.DFS_DIR) {
-                final String pathUri = lower(new Path(entity.getLocation()).toString());
-                LOG.debug("Registering DFS Path {} ", pathUri);
-                if (!dataSetsProcessed.contains(pathUri)) {
-                    Referenceable hdfsPath = dgiBridge.fillHDFSDataSet(pathUri);
-                    dataSets.put(entity, hdfsPath);
-                    dataSetsProcessed.add(pathUri);
-                    entities.add(hdfsPath);
+                URI location = entity.getLocation();
+                if (location != null) {
+                    final String pathUri = lower(new Path(location).toString());
+                    LOG.debug("Registering DFS Path {} ", pathUri);
+                    if (!dataSetsProcessed.contains(pathUri)) {
+                        Referenceable hdfsPath = dgiBridge.fillHDFSDataSet(pathUri);
+                        dataSets.put(entity, hdfsPath);
+                        dataSetsProcessed.add(pathUri);
+                        entities.add(hdfsPath);
+                    }
                 }
             }
         }

@@ -666,13 +675,17 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
     private void handleExternalTables(final HiveMetaStoreBridge dgiBridge, final HiveEventContext event, final LinkedHashMap<Type, Referenceable> tables) throws HiveException, MalformedURLException {
         List<Referenceable> entities = new ArrayList<>();
         final WriteEntity hiveEntity = (WriteEntity) getEntityByType(event.getOutputs(), Type.TABLE);
-        Table hiveTable = hiveEntity.getTable();
+        Table hiveTable = hiveEntity == null ? null : hiveEntity.getTable();
         //Refresh to get the correct location
-        hiveTable = dgiBridge.hiveClient.getTable(hiveTable.getDbName(), hiveTable.getTableName());
-        final String location = lower(hiveTable.getDataLocation().toString());
+        if (hiveTable != null) {
+            hiveTable = dgiBridge.hiveClient.getTable(hiveTable.getDbName(), hiveTable.getTableName());
+        }
         if (hiveTable != null && TableType.EXTERNAL_TABLE.equals(hiveTable.getTableType())) {
             LOG.info("Registering external table process {} ", event.getQueryStr());
+            final String location = lower(hiveTable.getDataLocation().toString());
             final ReadEntity dfsEntity = new ReadEntity();
             dfsEntity.setTyp(Type.DFS_DIR);
             dfsEntity.setName(location);

@@ -702,6 +715,7 @@ public class HiveHook extends AtlasHook implements ExecuteWithHookContext {
             entities.add(processReferenceable);
             event.addMessage(new HookNotification.EntityUpdateRequest(event.getUser(), entities));
         }
+        }
     }

     private boolean isCreateOp(HiveEventContext hiveEvent) {
authorization/src/main/java/org/apache/atlas/authorize/simple/PolicyParser.java

@@ -161,12 +161,7 @@ public class PolicyParser {
             if (def.getUsers() != null) {
                 usersMap = def.getUsers();
             }
-            List<AtlasActionTypes> userAutorities = usersMap.get(userAndRole[USERNAME]);
-            if (userAutorities == null) {
-                userAutorities = new ArrayList<AtlasActionTypes>();
-            }
-            userAutorities = getListOfAutorities(userAndRole[USER_AUTHORITIES]);
+            List<AtlasActionTypes> userAutorities = getListOfAutorities(userAndRole[USER_AUTHORITIES]);
             usersMap.put(userAndRole[USERNAME], userAutorities);
             def.setUsers(usersMap);
         }

@@ -195,11 +190,7 @@ public class PolicyParser {
             if (def.getGroups() != null) {
                 groupsMap = def.getGroups();
             }
-            List<AtlasActionTypes> groupAutorities = groupsMap.get(groupAndRole[GROUPNAME]);
-            if (groupAutorities == null) {
-                groupAutorities = new ArrayList<AtlasActionTypes>();
-            }
-            groupAutorities = getListOfAutorities(groupAndRole[GROUP_AUTHORITIES]);
+            List<AtlasActionTypes> groupAutorities = getListOfAutorities(groupAndRole[GROUP_AUTHORITIES]);
             groupsMap.put(groupAndRole[GROUPNAME], groupAutorities);
             def.setGroups(groupsMap);
         }
authorization/src/main/java/org/apache/atlas/authorize/simple/SimpleAtlasAuthorizer.java

@@ -32,6 +32,7 @@ import org.apache.atlas.authorize.AtlasAuthorizationException;
 import org.apache.atlas.authorize.AtlasAuthorizer;
 import org.apache.atlas.authorize.AtlasResourceTypes;
 import org.apache.atlas.utils.PropertiesUtil;
+import org.apache.commons.collections.CollectionUtils;
 import org.apache.commons.configuration.Configuration;
 import org.apache.commons.io.FilenameUtils;
 import org.apache.commons.io.IOCase;

@@ -224,10 +225,12 @@ public final class SimpleAtlasAuthorizer implements AtlasAuthorizer {
             LOG.debug("==> SimpleAtlasAuthorizer checkAccessForGroups");
         }

-        for (String group : groups) {
-            isAccessAllowed = checkAccess(group, resourceType, resource, map);
-            if (isAccessAllowed) {
-                break;
+        if (CollectionUtils.isNotEmpty(groups)) {
+            for (String group : groups) {
+                isAccessAllowed = checkAccess(group, resourceType, resource, map);
+                if (isAccessAllowed) {
+                    break;
+                }
             }
         }
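The SimpleAtlasAuthorizer hunk guards a for-each loop with `CollectionUtils.isNotEmpty`, since iterating a null collection with for-each throws a NullPointerException. A minimal sketch of the same shape, using a plain null/empty check in place of Commons Collections (the class, method, and parameter names here are hypothetical):

```java
import java.util.List;

public class GroupCheckSketch {
    // Guarded group check: a null or empty "groups" list simply yields false
    // instead of throwing when the for-each tries to iterate it.
    static boolean isAccessAllowed(List<String> groups, String allowedGroup) {
        boolean allowed = false;
        if (groups != null && !groups.isEmpty()) {        // CollectionUtils.isNotEmpty(groups)
            for (String group : groups) {
                allowed = allowedGroup.equals(group);     // stand-in for checkAccess(...)
                if (allowed) {
                    break;                                // stop at the first grant
                }
            }
        }
        return allowed;
    }
}
```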
client/src/main/java/org/apache/atlas/AtlasClient.java

@@ -144,8 +144,15 @@ public class AtlasClient {
     // New constuctor for Basic auth
     public AtlasClient(String[] baseUrl, String[] basicAuthUserNamepassword) {
-        this.basicAuthUser = basicAuthUserNamepassword[0];
-        this.basicAuthPassword = basicAuthUserNamepassword[1];
+        if (basicAuthUserNamepassword != null) {
+            if (basicAuthUserNamepassword.length > 0) {
+                this.basicAuthUser = basicAuthUserNamepassword[0];
+            }
+            if (basicAuthUserNamepassword.length > 1) {
+                this.basicAuthPassword = basicAuthUserNamepassword[1];
+            }
+        }

         initializeState(baseUrl, null, null);
     }

@@ -1119,7 +1126,8 @@ public class AtlasClient {
     private JSONObject callAPIWithResource(API api, WebResource resource, Object requestObject)
             throws AtlasServiceException {
         ClientResponse clientResponse = null;
-        for (int i = 0; i < getNumberOfRetries(); i++) {
+        int i = 0;
+        do {
             clientResponse = resource.accept(JSON_MEDIA_TYPE).type(JSON_MEDIA_TYPE)
                     .method(api.getMethod(), ClientResponse.class, requestObject);

@@ -1137,7 +1145,9 @@ public class AtlasClient {
                 LOG.error("Got a service unavailable when calling: {}, will retry..", resource);
                 sleepBetweenRetries();
             }
-        }
+
+            i++;
+        } while (i < getNumberOfRetries());

         throw new AtlasServiceException(api, clientResponse);
     }
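The second AtlasClient hunk converts the retry loop from `for` to `do/while`. The difference matters when the configured retry count is zero: a `for` loop body never runs, leaving `clientResponse` null, while `do/while` guarantees at least one attempt. A tiny sketch isolating just that loop-shape change (the counting method is hypothetical, not an Atlas API):

```java
public class RetrySketch {
    // Count how many attempts a do/while retry loop makes for a given
    // retry budget. Unlike for(i = 0; i < n; i++), the body executes at
    // least once even when n == 0.
    static int attempts(int numberOfRetries) {
        int attempts = 0;
        int i = 0;
        do {
            attempts++;   // stand-in for issuing the HTTP request
            i++;
        } while (i < numberOfRetries);
        return attempts;
    }
}
```

With the old `for` loop, a retry budget of 0 meant zero requests and a `null` response handed to `AtlasServiceException`; the `do/while` form always issues the request once.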
client/src/main/java/org/apache/atlas/AtlasServiceException.java

@@ -37,8 +37,8 @@ public class AtlasServiceException extends Exception {
     }

     private AtlasServiceException(AtlasClient.API api, ClientResponse.Status status, String response) {
-        super("Metadata service API " + api + " failed with status " + status.getStatusCode() + "(" +
-                status.getReasonPhrase() + ") Response Body (" + response + ")");
+        super("Metadata service API " + api + " failed with status " + (status != null ? status.getStatusCode() : -1)
+                + " (" + status + ") Response Body (" + response + ")");
         this.status = status;
     }
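This hunk makes the exception message null-safe: the status code is read through a ternary that substitutes -1 when `status` is null, and the status object itself is concatenated directly (string concatenation renders a null reference as "null" rather than throwing). A standalone sketch of that message-building pattern, with hypothetical names in place of the Jersey types:

```java
public class SafeMessageSketch {
    // Build a failure message without dereferencing a possibly-null status.
    // "+ status" is safe because Java string concatenation prints null as
    // the literal "null"; the code read is guarded by the ternary.
    static String message(Integer statusCode, Object status, String response) {
        return "failed with status " + (statusCode != null ? statusCode : -1)
                + " (" + status + ") Response Body (" + response + ")";
    }
}
```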
common/src/main/java/org/apache/atlas/utils/ParamChecker.java

@@ -139,20 +139,6 @@ public final class ParamChecker {
     }

-    /**
-     * Check that a list is not null and that none of its elements is null. If null or if the list has emtpy elements
-     * throws an IllegalArgumentException.
-     * @param list the list of strings.
-     * @param name parameter name for the exception message.
-     */
-    public static Collection<String> notEmptyElements(Collection<String> list, String name) {
-        notEmpty(list, name);
-        for (String ele : list) {
-            notEmpty(ele, String.format("list %s element %s", name, ele));
-        }
-        return list;
-    }
-
     /**
      * Checks that the given value is <= max value.
      * @param value
      * @param maxValue
notification/src/main/java/org/apache/atlas/hook/FailedMessagesLogger.java

@@ -77,13 +77,15 @@ public class FailedMessagesLogger {
         org.apache.log4j.Logger rootLogger = org.apache.log4j.Logger.getRootLogger();
         Enumeration allAppenders = rootLogger.getAllAppenders();

-        while (allAppenders.hasMoreElements()) {
-            Appender appender = (Appender) allAppenders.nextElement();
-            if (appender instanceof FileAppender) {
-                FileAppender fileAppender = (FileAppender) appender;
-                String rootLoggerFile = fileAppender.getFile();
-                rootLoggerDirectory = new File(rootLoggerFile).getParent();
-                break;
+        if (allAppenders != null) {
+            while (allAppenders.hasMoreElements()) {
+                Appender appender = (Appender) allAppenders.nextElement();
+                if (appender instanceof FileAppender) {
+                    FileAppender fileAppender = (FileAppender) appender;
+                    String rootLoggerFile = fileAppender.getFile();
+                    rootLoggerDirectory = new File(rootLoggerFile).getParent();
+                    break;
+                }
             }
         }

         return rootLoggerDirectory;
release-log.txt

@@ -6,6 +6,7 @@ INCOMPATIBLE CHANGES:

 ALL CHANGES:
+ATLAS-1033 fix for issues flagged by Coverity scan (mneethiraj via sumasai)
 ATLAS-1036 Compilation error on java 1.8 - GraphBackedDiscoveryService (shwethags via sumasai)
 ATLAS-1034 Incorrect Falcon hook impl class name in Falcon hook shim (mneethiraj via shwethags)
 ATLAS-347 Atlas search APIs should allow pagination of results (shwethags)
repository/src/main/java/org/apache/atlas/discovery/DataSetLineageService.java

@@ -101,7 +101,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getOutputsGraph(String datasetName) throws AtlasException {
         LOG.info("Fetching lineage outputs graph for datasetName={}", datasetName);
-        ParamChecker.notEmpty(datasetName, "dataset name");
+        datasetName = ParamChecker.notEmpty(datasetName, "dataset name");
         ReferenceableInstance datasetInstance = validateDatasetNameExists(datasetName);
         return getOutputsGraphForId(datasetInstance.getId()._getId());
     }

@@ -116,7 +116,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getInputsGraph(String tableName) throws AtlasException {
         LOG.info("Fetching lineage inputs graph for tableName={}", tableName);
-        ParamChecker.notEmpty(tableName, "table name");
+        tableName = ParamChecker.notEmpty(tableName, "table name");
         ReferenceableInstance datasetInstance = validateDatasetNameExists(tableName);
         return getInputsGraphForId(datasetInstance.getId()._getId());
     }

@@ -125,7 +125,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getInputsGraphForEntity(String guid) throws AtlasException {
         LOG.info("Fetching lineage inputs graph for entity={}", guid);
-        ParamChecker.notEmpty(guid, "Entity id");
+        guid = ParamChecker.notEmpty(guid, "Entity id");
         validateDatasetExists(guid);
         return getInputsGraphForId(guid);
     }

@@ -143,7 +143,7 @@ public class DataSetLineageService implements LineageService {
     @GraphTransaction
     public String getOutputsGraphForEntity(String guid) throws AtlasException {
         LOG.info("Fetching lineage outputs graph for entity guid={}", guid);
-        ParamChecker.notEmpty(guid, "Entity id");
+        guid = ParamChecker.notEmpty(guid, "Entity id");
         validateDatasetExists(guid);
         return getOutputsGraphForId(guid);
     }

@@ -165,7 +165,7 @@ public class DataSetLineageService implements LineageService {
     @Override
     @GraphTransaction
     public String getSchema(String datasetName) throws AtlasException {
-        ParamChecker.notEmpty(datasetName, "table name");
+        datasetName = ParamChecker.notEmpty(datasetName, "table name");
         LOG.info("Fetching schema for tableName={}", datasetName);
         ReferenceableInstance datasetInstance = validateDatasetNameExists(datasetName);

@@ -182,7 +182,7 @@ public class DataSetLineageService implements LineageService {
     @Override
     @GraphTransaction
     public String getSchemaForEntity(String guid) throws AtlasException {
-        ParamChecker.notEmpty(guid, "Entity id");
+        guid = ParamChecker.notEmpty(guid, "Entity id");
         LOG.info("Fetching schema for entity guid={}", guid);
         String typeName = validateDatasetExists(guid);
         return getSchemaForId(typeName, guid);
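Every DataSetLineageService hunk makes the same one-line change: the return value of `ParamChecker.notEmpty` is assigned back to the parameter instead of being discarded, the "ignored return value" pattern Coverity flags. A minimal sketch of a validate-and-return checker and a caller that uses the checked value (the classes here are hypothetical; whether the real `ParamChecker.notEmpty` normalizes its argument is an assumption for illustration):

```java
public class ParamCheckSketch {
    // Validate-and-return style checker: throws on bad input, otherwise
    // returns the value (here normalized by trimming, as an illustration)
    // so callers can continue with the checked result.
    static String notEmpty(String value, String name) {
        if (value == null || value.trim().isEmpty()) {
            throw new IllegalArgumentException(name + " cannot be empty");
        }
        return value.trim();
    }

    static String getSchema(String datasetName) {
        datasetName = notEmpty(datasetName, "table name"); // assign, don't discard
        return "schema:" + datasetName;
    }
}
```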
repository/src/main/java/org/apache/atlas/discovery/graph/GraphBackedDiscoveryService.java

@@ -167,6 +167,11 @@ public class GraphBackedDiscoveryService implements DiscoveryService {
         LOG.info("Executing gremlin query={}", gremlinQuery);
         ScriptEngineManager manager = new ScriptEngineManager();
         ScriptEngine engine = manager.getEngineByName("gremlin-groovy");
+
+        if (engine == null) {
+            throw new DiscoveryException("gremlin-groovy: engine not found");
+        }
+
         Bindings bindings = engine.createBindings();
         bindings.put("g", titanGraph);
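`ScriptEngineManager.getEngineByName` returns null rather than throwing when no engine is registered under the given name, so the hunk above fails fast with a `DiscoveryException` instead of NPE-ing later on `engine.createBindings()`. A self-contained sketch of that fail-fast lookup, using `IllegalStateException` as a stand-in for Atlas's `DiscoveryException`:

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class EngineLookupSketch {
    // Look up a script engine and fail fast with a descriptive error if it
    // is missing, instead of letting the null propagate to the first use.
    static ScriptEngine requireEngine(String name) {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName(name);
        if (engine == null) {
            throw new IllegalStateException(name + ": engine not found");
        }
        return engine;
    }
}
```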
repository/src/main/java/org/apache/atlas/repository/graph/DeleteHandler.java

@@ -334,28 +334,29 @@ public abstract class DeleteHandler {
             String keyPropertyName = GraphHelper.getQualifiedNameForMapKey(propertyName, key);
             String mapEdgeId = GraphHelper.getProperty(outVertex, keyPropertyName);
             Edge mapEdge = graphHelper.getEdgeByEdgeId(outVertex, keyPropertyName, mapEdgeId);
-            Vertex mapVertex = mapEdge.getVertex(Direction.IN);
-            if (mapVertex.getId().toString().equals(inVertex.getId().toString())) {
-                //TODO keys.size includes deleted items as well. should exclude
-                if (attributeInfo.multiplicity.nullAllowed() || keys.size() > attributeInfo.multiplicity.lower) {
-                    edge = mapEdge;
-                }
-                else {
-                    // Deleting this entry would violate the attribute's lower bound.
-                    throw new NullRequiredAttributeException(
-                        "Cannot remove map entry " + keyPropertyName + " from required attribute " +
-                        GraphHelper.getQualifiedFieldName(type, attributeName) + " on " + string(outVertex) + " " + string(mapEdge));
-                }
-                if (shouldUpdateReverseAttribute) {
-                    //remove this key
-                    LOG.debug("Removing edge {}, key {} from the map attribute {}", string(mapEdge), key,
-                        attributeName);
-                    keys.remove(key);
-                    GraphHelper.setProperty(outVertex, propertyName, keys);
-                    GraphHelper.setProperty(outVertex, keyPropertyName, null);
-                }
-                break;
+            if (mapEdge != null) {
+                Vertex mapVertex = mapEdge.getVertex(Direction.IN);
+                if (mapVertex.getId().toString().equals(inVertex.getId().toString())) {
+                    //TODO keys.size includes deleted items as well. should exclude
+                    if (attributeInfo.multiplicity.nullAllowed() || keys.size() > attributeInfo.multiplicity.lower) {
+                        edge = mapEdge;
+                    }
+                    else {
+                        // Deleting this entry would violate the attribute's lower bound.
+                        throw new NullRequiredAttributeException(
+                            "Cannot remove map entry " + keyPropertyName + " from required attribute " +
+                            GraphHelper.getQualifiedFieldName(type, attributeName) + " on " + string(outVertex) + " " + string(mapEdge));
+                    }
+                    if (shouldUpdateReverseAttribute) {
+                        //remove this key
+                        LOG.debug("Removing edge {}, key {} from the map attribute {}", string(mapEdge), key,
+                            attributeName);
+                        keys.remove(key);
+                        GraphHelper.setProperty(outVertex, propertyName, keys);
+                        GraphHelper.setProperty(outVertex, keyPropertyName, null);
+                    }
+                    break;
+                }
             }
         }
     }
repository/src/main/java/org/apache/atlas/repository/graph/GraphBackedMetadataRepository.java
View file @
e15629c2
...
@@ -268,11 +268,13 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
...
@@ -268,11 +268,13 @@ public class GraphBackedMetadataRepository implements MetadataRepository {
final
String
entityTypeName
=
GraphHelper
.
getTypeName
(
instanceVertex
);
final
String
entityTypeName
=
GraphHelper
.
getTypeName
(
instanceVertex
);
String
relationshipLabel
=
GraphHelper
.
getTraitLabel
(
entityTypeName
,
traitNameToBeDeleted
);
String
relationshipLabel
=
GraphHelper
.
getTraitLabel
(
entityTypeName
,
traitNameToBeDeleted
);
Edge
edge
=
GraphHelper
.
getEdgeForLabel
(
instanceVertex
,
relationshipLabel
);
Edge
edge
=
GraphHelper
.
getEdgeForLabel
(
instanceVertex
,
relationshipLabel
);
deleteHandler
.
deleteEdgeReference
(
edge
,
DataTypes
.
TypeCategory
.
TRAIT
,
false
,
true
);
if
(
edge
!=
null
)
{
deleteHandler
.
deleteEdgeReference
(
edge
,
DataTypes
.
TypeCategory
.
TRAIT
,
false
,
true
);
// update the traits in entity once trait removal is successful
// update the traits in entity once trait removal is successful
traitNames
.
remove
(
traitNameToBeDeleted
);
traitNames
.
remove
(
traitNameToBeDeleted
);
updateTraits
(
instanceVertex
,
traitNames
);
updateTraits
(
instanceVertex
,
traitNames
);
}
}
catch
(
Exception
e
)
{
}
catch
(
Exception
e
)
{
throw
new
RepositoryException
(
e
);
throw
new
RepositoryException
(
e
);
}
}
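The hunk above guards `deleteEdgeReference` with a null check, since `getEdgeForLabel` can return null when the trait edge is absent, and only runs the follow-up bookkeeping when the edge existed. A minimal standalone sketch of that guarded-delete pattern, using an illustrative `Store` class rather than the Atlas repository API:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the guarded-delete pattern above: look up an edge that may
// not exist, and only delete (and update bookkeeping) when the lookup
// succeeded. Store/deleteTrait are illustrative stand-ins.
public class GuardedDelete {
    static class Store {
        final Map<String, String> edges = new HashMap<>();

        String getEdgeForLabel(String label) {
            return edges.get(label);
        }

        boolean deleteTrait(String label) {
            String edge = getEdgeForLabel(label);
            if (edge != null) {
                edges.remove(label);   // delete only when the edge exists
                return true;           // bookkeeping runs only on success
            }
            return false;
        }
    }
}
```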
...
repository/src/main/java/org/apache/atlas/repository/graph/GraphHelper.java

@@ -429,20 +429,28 @@ public final class GraphHelper {
     }
 
     public static String string(Vertex vertex) {
-        if (LOG.isDebugEnabled()) {
-            return String.format("vertex[id=%s type=%s guid=%s]", vertex.getId().toString(), getTypeName(vertex),
-                    getIdFromVertex(vertex));
+        if (vertex == null) {
+            return "vertex[null]";
         } else {
-            return String.format("vertex[id=%s]", vertex.getId().toString());
+            if (LOG.isDebugEnabled()) {
+                return String.format("vertex[id=%s type=%s guid=%s]", vertex.getId().toString(), getTypeName(vertex),
+                        getIdFromVertex(vertex));
+            } else {
+                return String.format("vertex[id=%s]", vertex.getId().toString());
+            }
         }
     }
 
     public static String string(Edge edge) {
-        if (LOG.isDebugEnabled()) {
-            return String.format("edge[id=%s label=%s from %s -> to %s]", edge.getId().toString(), edge.getLabel(),
-                    string(edge.getVertex(Direction.OUT)), string(edge.getVertex(Direction.IN)));
+        if (edge == null) {
+            return "edge[null]";
         } else {
-            return String.format("edge[id=%s]", edge.getId().toString());
+            if (LOG.isDebugEnabled()) {
+                return String.format("edge[id=%s label=%s from %s -> to %s]", edge.getId().toString(), edge.getLabel(),
+                        string(edge.getVertex(Direction.OUT)), string(edge.getVertex(Direction.IN)));
+            } else {
+                return String.format("edge[id=%s]", edge.getId().toString());
+            }
         }
     }
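The `string(Vertex)`/`string(Edge)` changes above make the debug formatters tolerate null arguments so that log statements never throw. A minimal sketch of the same null-safe formatter pattern, with `Node` and `describe()` as hypothetical stand-ins for Blueprints' `Vertex` and Atlas' `string()`:

```java
// Minimal sketch of the null-safe debug-formatter pattern used above.
// "Node" and describe() are hypothetical stand-ins, not the Atlas API.
public class DebugFormat {
    static class Node {
        final Object id;
        Node(Object id) { this.id = id; }
    }

    // Returns a printable form even when the node is null, so callers
    // can log freely without a NullPointerException.
    public static String describe(Node node) {
        if (node == null) {
            return "node[null]";
        }
        return String.format("node[id=%s]", node.id);
    }
}
```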
...
repository/src/main/java/org/apache/atlas/repository/graph/TypedInstanceToGraphMapper.java

@@ -370,10 +370,12 @@ public final class TypedInstanceToGraphMapper {
         if (!cloneElements.isEmpty()) {
             for (String edgeIdForDelete : cloneElements) {
                 Edge edge = graphHelper.getEdgeByEdgeId(instanceVertex, edgeLabel, edgeIdForDelete);
-                boolean deleted = deleteHandler.deleteEdgeReference(edge, entryType.getTypeCategory(),
-                        attributeInfo.isComposite, true);
-                if (!deleted) {
-                    additionalElements.add(edgeIdForDelete);
+                if (edge != null) {
+                    boolean deleted = deleteHandler.deleteEdgeReference(edge, entryType.getTypeCategory(),
+                            attributeInfo.isComposite, true);
+                    if (!deleted) {
+                        additionalElements.add(edgeIdForDelete);
+                    }
                 }
             }
         }
...
@@ -454,11 +456,13 @@ public final class TypedInstanceToGraphMapper {
             if (!newMap.values().contains(currentEdge)) {
                 String edgeLabel = GraphHelper.getQualifiedNameForMapKey(propertyName, currentKey);
                 Edge edge = graphHelper.getEdgeByEdgeId(instanceVertex, edgeLabel, currentMap.get(currentKey));
-                boolean deleted =
-                        deleteHandler.deleteEdgeReference(edge, elementType.getTypeCategory(), attributeInfo.isComposite, true);
-                if (!deleted) {
-                    additionalMap.put(currentKey, currentEdge);
-                    shouldDeleteKey = false;
+                if (edge != null) {
+                    boolean deleted =
+                            deleteHandler.deleteEdgeReference(edge, elementType.getTypeCategory(), attributeInfo.isComposite, true);
+                    if (!deleted) {
+                        additionalMap.put(currentKey, currentEdge);
+                        shouldDeleteKey = false;
+                    }
                 }
             }
...
@@ -702,7 +706,9 @@ public final class TypedInstanceToGraphMapper {
         } else if (attributeInfo.dataType() == DataTypes.DATE_TYPE) {
             final Date dateVal = typedInstance.getDate(attributeInfo.name);
             //Convert Property value to Long while persisting
-            propertyValue = dateVal.getTime();
+            if (dateVal != null) {
+                propertyValue = dateVal.getTime();
+            }
         } else if (attributeInfo.dataType().getTypeCategory() == DataTypes.TypeCategory.ENUM) {
             if (attrValue != null) {
                 propertyValue = ((EnumValue) attrValue).value;
...
repository/src/main/java/org/apache/atlas/repository/typestore/GraphBackedTypeStore.java

@@ -168,16 +168,20 @@ public class GraphBackedTypeStore implements ITypeStore {
         switch (attrDataType.getTypeCategory()) {
         case ARRAY:
             String attrType = TypeUtils.parseAsArrayType(attrDataType.getName());
-            IDataType elementType = typeSystem.getDataType(IDataType.class, attrType);
-            attrDataTypes.add(elementType);
+            if (attrType != null) {
+                IDataType elementType = typeSystem.getDataType(IDataType.class, attrType);
+                attrDataTypes.add(elementType);
+            }
             break;
 
         case MAP:
             String[] attrTypes = TypeUtils.parseAsMapType(attrDataType.getName());
-            IDataType keyType = typeSystem.getDataType(IDataType.class, attrTypes[0]);
-            IDataType valueType = typeSystem.getDataType(IDataType.class, attrTypes[1]);
-            attrDataTypes.add(keyType);
-            attrDataTypes.add(valueType);
+            if (attrTypes != null && attrTypes.length > 1) {
+                IDataType keyType = typeSystem.getDataType(IDataType.class, attrTypes[0]);
+                IDataType valueType = typeSystem.getDataType(IDataType.class, attrTypes[1]);
+                attrDataTypes.add(keyType);
+                attrDataTypes.add(valueType);
+            }
             break;
 
         case ENUM:
...
repository/src/main/java/org/apache/atlas/services/DefaultMetadataService.java
(This diff is collapsed.)
titan/src/main/java/com/thinkaurelius/titan/diskstorage/hbase/HBaseKeyColumnValueStore.java

@@ -330,18 +330,19 @@ public class HBaseKeyColumnValueStore implements KeyColumnValueStore {
             ensureOpen();
 
             return new RecordIterator<Entry>() {
-                private final Iterator<Map.Entry<byte[], NavigableMap<Long, byte[]>>> kv = currentRow.getMap().get(columnFamilyBytes).entrySet().iterator();
+                private final NavigableMap<byte[], NavigableMap<byte[], NavigableMap<Long, byte[]>>> currentMap = currentRow.getMap();
+                private final Iterator<Map.Entry<byte[], NavigableMap<Long, byte[]>>> kv = currentMap == null ? null : currentMap.get(columnFamilyBytes).entrySet().iterator();
 
                 @Override
                 public boolean hasNext() {
                     ensureOpen();
-                    return kv.hasNext();
+                    return kv == null ? false : kv.hasNext();
                 }
 
                 @Override
                 public Entry next() {
                     ensureOpen();
-                    return StaticArrayEntry.ofBytes(kv.next(), entryGetter);
+                    return kv == null ? null : StaticArrayEntry.ofBytes(kv.next(), entryGetter);
                 }
 
                 @Override
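The iterator fix above holds a nullable backing iterator and short-circuits `hasNext()`/`next()` instead of dereferencing a possibly-null map. A minimal sketch of that guard pattern over a plain `java.util.Iterator` (names are illustrative, not Titan's API):

```java
import java.util.Iterator;
import java.util.List;

// Sketch of the guard above: when the backing collection may be null,
// keep a nullable inner iterator and make hasNext()/next() degrade
// gracefully instead of throwing a NullPointerException.
public class GuardedIterator<T> implements Iterator<T> {
    private final Iterator<T> inner;

    public GuardedIterator(List<T> source) {
        this.inner = source == null ? null : source.iterator();
    }

    @Override
    public boolean hasNext() {
        return inner == null ? false : inner.hasNext();
    }

    @Override
    public T next() {
        return inner == null ? null : inner.next();
    }
}
```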
...
typesystem/src/main/java/org/apache/atlas/typesystem/persistence/Id.java

@@ -45,9 +45,9 @@ public class Id implements ITypedReferenceableInstance {
     public EntityState state;
 
     public Id(String id, int version, String typeName, String state) {
-        ParamChecker.notEmpty(id, "id");
-        ParamChecker.notEmpty(typeName, "typeName");
-        ParamChecker.notEmptyIfNotNull(state, "state");
+        id = ParamChecker.notEmpty(id, "id");
+        typeName = ParamChecker.notEmpty(typeName, "typeName");
+        state = ParamChecker.notEmptyIfNotNull(state, "state");
         this.id = id;
         this.typeName = typeName;
         this.version = version;
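The `Id` constructor above (like the `EntityResource` and `MetadataDiscoveryResource` hunks later in this commit) switches to assigning the result of `ParamChecker.notEmpty` back to the parameter: the checker now returns the validated value, so static analyzers see the checked value actually being used. A minimal sketch of that validate-and-return idiom, with `Checker` as a hypothetical stand-in for Atlas' `ParamChecker`:

```java
// Sketch of the validate-and-return idiom adopted in this commit:
// the checker returns its argument so callers can write
// `id = Checker.notEmpty(id, "id")`. "Checker" is a stand-in.
public class Checker {
    public static String notEmpty(String value, String name) {
        if (value == null || value.trim().isEmpty()) {
            throw new IllegalArgumentException(name + " cannot be null or empty");
        }
        return value;   // return the validated value for assignment
    }
}
```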
...
typesystem/src/main/java/org/apache/atlas/typesystem/types/AbstractDataType.java

@@ -47,12 +47,18 @@ abstract class AbstractDataType<T> implements IDataType<T> {
     @Override
     public void output(T val, Appendable buf, String prefix, Set<T> inProcess) throws AtlasException {
-        if (val instanceof Map) {
+        final String strValue;
+
+        if (val == null) {
+            strValue = "<null>";
+        } else if (val instanceof Map) {
             ImmutableSortedMap immutableSortedMap = ImmutableSortedMap.copyOf((Map) val);
-            TypeUtils.outputVal(val == null ? "<null>" : immutableSortedMap.toString(), buf, prefix);
+
+            strValue = immutableSortedMap.toString();
         } else {
-            TypeUtils.outputVal(val == null ? "<null>" : val.toString(), buf, prefix);
+            strValue = val.toString();
         }
+
+        TypeUtils.outputVal(strValue, buf, prefix);
     }
 
     @Override
...
typesystem/src/main/java/org/apache/atlas/typesystem/types/HierarchicalType.java

@@ -517,17 +517,20 @@ public abstract class HierarchicalType<ST extends HierarchicalType, T> extends A
         @Override
         public Path next() {
             Path p = pathQueue.poll();
-            ST t = null;
 
-            try {
-                t = (ST) typeSystem.getDataType(superTypeClass, p.typeName);
-            } catch (AtlasException me) {
-                throw new RuntimeException(me);
-            }
+            if (p != null) {
+                ST t = null;
 
-            if (t.superTypes != null) {
-                ImmutableSet<String> sTs = t.superTypes;
+                try {
+                    t = (ST) typeSystem.getDataType(superTypeClass, p.typeName);
+                } catch (AtlasException me) {
+                    throw new RuntimeException(me);
+                }
 
-                for (String sT : sTs) {
-                    String nm = sT + "." + p.pathName;
-                    pathQueue.add(pathNameToPathMap.get(nm));
+                if (t.superTypes != null) {
+                    ImmutableSet<String> sTs = t.superTypes;
+
+                    for (String sT : sTs) {
+                        String nm = sT + "." + p.pathName;
+                        pathQueue.add(pathNameToPathMap.get(nm));
+                    }
                 }
             }
 
             return p;
...
typesystem/src/main/java/org/apache/atlas/typesystem/types/ObjectGraphTraversal.java

@@ -172,7 +172,9 @@ public class ObjectGraphTraversal implements Iterator<ObjectGraphTraversal.Insta
     public InstanceTuple next() {
         try {
             InstanceTuple t = queue.poll();
-            processReferenceableInstance(t.instance);
+            if (t != null) {
+                processReferenceableInstance(t.instance);
+            }
             return t;
         } catch (AtlasException me) {
             throw new RuntimeException(me);
...
typesystem/src/main/java/org/apache/atlas/typesystem/types/ObjectGraphWalker.java

@@ -76,7 +76,9 @@ public class ObjectGraphWalker {
     public void walk() throws AtlasException {
         while (!queue.isEmpty()) {
             IReferenceableInstance r = queue.poll();
-            processReferenceableInstance(r);
+            if (r != null) {
+                processReferenceableInstance(r);
+            }
         }
     }
...
webapp/src/main/java/org/apache/atlas/Atlas.java

@@ -147,9 +147,12 @@ public final class Atlas {
     }
 
     static int getApplicationPort(CommandLine cmd, String enableTLSFlag, Configuration configuration) {
+        String optionValue = cmd.hasOption(APP_PORT) ? cmd.getOptionValue(APP_PORT) : null;
         final int appPort;
-        if (cmd.hasOption(APP_PORT)) {
-            appPort = Integer.valueOf(cmd.getOptionValue(APP_PORT));
+
+        if (StringUtils.isNotEmpty(optionValue)) {
+            appPort = Integer.valueOf(optionValue);
         } else {
             // default : atlas.enableTLS is true
             appPort = getPortValue(configuration, enableTLSFlag);
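The `getApplicationPort` change above prefers a non-empty command-line value and otherwise falls back to the configured port, instead of parsing a value that might be absent. A simplified sketch of that resolution order, with `parsePort()` as an illustrative stand-in (no commons-cli or Commons Configuration dependency):

```java
// Sketch of the port-resolution order above: use the command-line
// value only when it is non-empty, else fall back to configuration.
// parsePort() is a simplified stand-in for getApplicationPort().
public class PortResolver {
    public static int parsePort(String optionValue, int configuredPort) {
        if (optionValue != null && !optionValue.isEmpty()) {
            return Integer.valueOf(optionValue);
        }
        return configuredPort;   // default: value from configuration
    }
}
```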
...
webapp/src/main/java/org/apache/atlas/examples/QuickStart.java

@@ -466,7 +466,7 @@ public class QuickStart {
         if (results != null) {
             System.out.println("query [" + dslQuery + "] returned [" + results.length() + "] rows");
         } else {
-            System.out.println("query [" + dslQuery + "] failed, results:" + results.toString());
+            System.out.println("query [" + dslQuery + "] failed, results:" + results);
         }
     }
...
webapp/src/main/java/org/apache/atlas/util/CredentialProviderUtility.java

@@ -74,26 +74,28 @@ public class CredentialProviderUtility {
         // prompt for the provider name
         CredentialProvider provider = getCredentialProvider(textDevice);
 
+        if (provider != null) {
             char[] cred;
             for (String key : KEYS) {
                 cred = getPassword(textDevice, key);
                 // create a credential entry and store it
                 boolean overwrite = true;
                 if (provider.getCredentialEntry(key) != null) {
                     String choice = textDevice.readLine("Entry for %s already exists. Overwrite? (y/n) [y]:", key);
                     overwrite = StringUtils.isEmpty(choice) || choice.equalsIgnoreCase("y");
                     if (overwrite) {
                         provider.deleteCredentialEntry(key);
                         provider.flush();
                         provider.createCredentialEntry(key, cred);
                         provider.flush();
                         textDevice.printf("Entry for %s was overwritten with the new value.\n", key);
                     } else {
                         textDevice.printf("Entry for %s was not overwritten.\n", key);
                     }
                 } else {
                     provider.createCredentialEntry(key, cred);
                     provider.flush();
                 }
             }
+        }
...
@@ -141,16 +143,21 @@ public class CredentialProviderUtility {
      */
     private static CredentialProvider getCredentialProvider(TextDevice textDevice) throws IOException {
         String providerPath = textDevice.readLine("Please enter the full path to the credential provider:");
+        if (providerPath != null) {
             File file = new File(providerPath);
             if (file.exists()) {
                 textDevice.printf("%s already exists. You will need to specify whether existing entries should be "
                         + "overwritten " + "(default is 'yes')\n", providerPath);
             }
             String providerURI = JavaKeyStoreProvider.SCHEME_NAME + "://file/" + providerPath;
             Configuration conf = new Configuration(false);
             conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, providerURI);
             return CredentialProviderFactory.getProviders(conf).get(0);
+        }
+
+        return null;
     }
webapp/src/main/java/org/apache/atlas/web/dao/UserDao.java

@@ -19,6 +19,7 @@ package org.apache.atlas.web.dao;
 import com.google.common.annotations.VisibleForTesting;
 import java.io.FileInputStream;
+import java.io.InputStream;
 import java.io.IOException;
 import java.util.ArrayList;
 import java.util.Properties;
...
@@ -54,6 +55,8 @@ public class UserDao {
     void loadFileLoginsDetails() {
         String PROPERTY_FILE_PATH = null;
+        InputStream inStr = null;
+
         try {
             Configuration configuration = ApplicationProperties.get();
...
@@ -61,7 +64,8 @@ public class UserDao {
                     .getString("atlas.authentication.method.file.filename");
             if (PROPERTY_FILE_PATH != null && !"".equals(PROPERTY_FILE_PATH)) {
                 userLogins = new Properties();
-                userLogins.load(new FileInputStream(PROPERTY_FILE_PATH));
+                inStr = new FileInputStream(PROPERTY_FILE_PATH);
+                userLogins.load(inStr);
             } else {
                 LOG.error("Error while reading user.properties file, filepath=" + PROPERTY_FILE_PATH);
...
@@ -70,6 +74,14 @@ public class UserDao {
         } catch (IOException | AtlasException e) {
             LOG.error("Error while reading user.properties file, filepath=" + PROPERTY_FILE_PATH, e);
+        } finally {
+            if (inStr != null) {
+                try {
+                    inStr.close();
+                } catch (Exception excp) {
+                    // ignore
+                }
+            }
         }
     }
...
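The `UserDao` fix above hoists the `FileInputStream` into a variable and closes it in a `finally` block so the handle is never leaked. The same leak can also be closed with try-with-resources (Java 7+), which releases the stream automatically even when `load()` throws; a sketch with `loadProps()` as an illustrative helper, not the `UserDao` API:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

// Alternative to the close-in-finally fix above: try-with-resources
// closes the stream automatically, even on exception.
// loadProps() is an illustrative helper.
public class PropsLoader {
    public static Properties loadProps(InputStream in) throws IOException {
        Properties props = new Properties();
        try (InputStream autoClosed = in) {   // closed even if load() throws
            props.load(autoClosed);
        }
        return props;
    }
}
```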
...
webapp/src/main/java/org/apache/atlas/web/resources/EntityResource.java

@@ -339,7 +339,7 @@ public class EntityResource {
     private Response updateEntityPartialByGuid(String guid, HttpServletRequest request) {
         String entityJson = null;
         try {
-            ParamChecker.notEmpty(guid, "Guid property cannot be null");
+            guid = ParamChecker.notEmpty(guid, "Guid property cannot be null");
             entityJson = Servlets.getRequestPayload(request);
             LOG.info("partially updating entity for guid {} : {} ", guid, entityJson);
...
@@ -468,7 +468,7 @@ public class EntityResource {
             }
 
             LOG.debug("Fetching entity definition for guid={} ", guid);
-            ParamChecker.notEmpty(guid, "guid cannot be null");
+            guid = ParamChecker.notEmpty(guid, "guid cannot be null");
             final String entityDefinition = metadataService.getEntityDefinition(guid);
 
             JSONObject response = new JSONObject();
...
@@ -564,9 +564,9 @@ public class EntityResource {
     public Response getEntityDefinitionByAttribute(String entityType, String attribute, String value) {
         try {
             LOG.debug("Fetching entity definition for type={}, qualified name={}", entityType, value);
-            ParamChecker.notEmpty(entityType, "Entity type cannot be null");
-            ParamChecker.notEmpty(attribute, "attribute name cannot be null");
-            ParamChecker.notEmpty(value, "attribute value cannot be null");
+            entityType = ParamChecker.notEmpty(entityType, "Entity type cannot be null");
+            attribute = ParamChecker.notEmpty(attribute, "attribute name cannot be null");
+            value = ParamChecker.notEmpty(value, "attribute value cannot be null");
 
             final String entityDefinition = metadataService.getEntityDefinition(entityType, attribute, value);
...
webapp/src/main/java/org/apache/atlas/web/resources/MetadataDiscoveryResource.java

@@ -91,7 +91,7 @@ public class MetadataDiscoveryResource {
                            @DefaultValue(LIMIT_OFFSET_DEFAULT) @QueryParam("offset") int offset) {
         AtlasPerfTracer perf = null;
         if (AtlasPerfTracer.isPerfTraceEnabled(PERF_LOG)) {
-            perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.search(" + query + ")");
+            perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.search(" + query + ", " + limit + ", " + offset + ")");
         }
 
         Response response = searchUsingQueryDSL(query, limit, offset);
         if (response.getStatus() != Response.Status.OK.getStatusCode()) {
...
@@ -123,10 +123,10 @@ public class MetadataDiscoveryResource {
         AtlasPerfTracer perf = null;
         try {
             if (AtlasPerfTracer.isPerfTraceEnabled(PERF_LOG)) {
-                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingQueryDSL(" + dslQuery + ")");
+                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingQueryDSL(" + dslQuery + ", " + limit + ", " + offset + ")");
             }
 
-            ParamChecker.notEmpty(dslQuery, "dslQuery cannot be null");
+            dslQuery = ParamChecker.notEmpty(dslQuery, "dslQuery cannot be null");
             QueryParams queryParams = validateQueryParams(limit, offset);
             final String jsonResultStr = discoveryService.searchByDSL(dslQuery, queryParams);
...
@@ -184,7 +184,7 @@ public class MetadataDiscoveryResource {
                 perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingGremlinQuery(" + gremlinQuery + ")");
             }
 
-            ParamChecker.notEmpty(gremlinQuery, "gremlinQuery cannot be null or empty");
+            gremlinQuery = ParamChecker.notEmpty(gremlinQuery, "gremlinQuery cannot be null or empty");
             final List<Map<String, String>> results = discoveryService.searchByGremlin(gremlinQuery);
 
             JSONObject response = new JSONObject();
...
@@ -230,10 +230,10 @@ public class MetadataDiscoveryResource {
         AtlasPerfTracer perf = null;
         try {
             if (AtlasPerfTracer.isPerfTraceEnabled(PERF_LOG)) {
-                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingFullText(" + query + ")");
+                perf = AtlasPerfTracer.getPerfTracer(PERF_LOG, "MetadataDiscoveryResource.searchUsingFullText(" + query + ", " + limit + ", " + offset + ")");
             }
 
-            ParamChecker.notEmpty(query, "query cannot be null or empty");
+            query = ParamChecker.notEmpty(query, "query cannot be null or empty");
             QueryParams queryParams = validateQueryParams(limit, offset);
             final String jsonResultStr = discoveryService.searchByFullText(query, queryParams);
             JSONArray rowsJsonArr = new JSONArray(jsonResultStr);
...
webapp/src/main/java/org/apache/atlas/web/security/AtlasAuthenticationProvider.java

@@ -69,35 +69,32 @@ public class AtlasAuthenticationProvider extends
         if (ldapType.equalsIgnoreCase("LDAP")) {
             try {
                 authentication = ldapAuthenticationProvider.authenticate(authentication);
             } catch (Exception ex) {
                 LOG.error("Error while LDAP authentication", ex);
             }
         } else if (ldapType.equalsIgnoreCase("AD")) {
             try {
                 authentication = adAuthenticationProvider.authenticate(authentication);
             } catch (Exception ex) {
                 LOG.error("Error while AD authentication", ex);
             }
         }
 
-        if (authentication != null && authentication.isAuthenticated()) {
-            return authentication;
-        } else {
-            // If the LDAP/AD authentication fails try the local filebased login method
-            if (fileAuthenticationMethodEnabled) {
-                authentication = fileAuthenticationProvider.authenticate(authentication);
-            }
-            if (authentication != null && authentication.isAuthenticated()) {
-                return authentication;
-            } else {
-                LOG.error("Authentication failed.");
-                throw new AtlasAuthenticationException("Authentication failed.");
+        if (authentication != null) {
+            if (authentication.isAuthenticated()) {
+                return authentication;
+            } else if (fileAuthenticationMethodEnabled) {
+                // If the LDAP/AD authentication fails try the local filebased login method
+                authentication = fileAuthenticationProvider.authenticate(authentication);
+
+                if (authentication != null && authentication.isAuthenticated()) {
+                    return authentication;
+                }
             }
         }
+
+        LOG.error("Authentication failed.");
+        throw new AtlasAuthenticationException("Authentication failed.");
     }
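The restructured flow above tries LDAP/AD first and falls back to the file-based provider only when the earlier result is null or not authenticated, with a single failure path at the end. A minimal sketch of that fallback-chain shape, where `Provider` and `Result` are illustrative stand-ins for Spring Security's `AuthenticationProvider` and `Authentication`:

```java
import java.util.List;

// Sketch of the fallback chain above: try providers in order and move
// to the next only when the previous result is null or unauthenticated.
// Provider/Result are illustrative stand-ins, not Spring Security types.
public class AuthChain {
    interface Provider {
        Result authenticate();
    }

    static class Result {
        final boolean authenticated;
        Result(boolean authenticated) { this.authenticated = authenticated; }
    }

    public static Result authenticate(List<Provider> providers) {
        for (Provider p : providers) {
            Result r = p.authenticate();
            if (r != null && r.authenticated) {
                return r;   // first successful provider wins
            }
        }
        // single failure path once every provider has been tried
        throw new SecurityException("Authentication failed.");
    }
}
```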