dataplatform / atlas

Commit 16107915, authored Mar 09, 2016 by Suma Shivaprasad
ATLAS-537 Falcon hook failing when tried to submit a process which creates a hive table. (shwethags via sumasai)

Parent: 0defc6e8
Showing 4 changed files with 113 additions and 20 deletions:

  addons/falcon-bridge/src/main/java/org/apache/atlas/falcon/hook/FalconHook.java   (+8, -4)
  addons/falcon-bridge/src/test/java/org/apache/atlas/falcon/hook/FalconHookIT.java (+65, -16)
  addons/falcon-bridge/src/test/resources/feed-hdfs.xml                             (+39, -0)
  release-log.txt                                                                   (+1, -0)
addons/falcon-bridge/src/main/java/org/apache/atlas/falcon/hook/FalconHook.java
@@ -235,8 +235,10 @@ public class FalconHook extends FalconEventPublisher {
         if (process.getInputs() != null) {
             for (Input input : process.getInputs().getInputs()) {
                 List<Referenceable> clusterInputs = getInputOutputEntity(cluster, input.getFeed());
-                entities.addAll(clusterInputs);
-                inputs.add(clusterInputs.get(clusterInputs.size() - 1));
+                if (clusterInputs != null) {
+                    entities.addAll(clusterInputs);
+                    inputs.add(clusterInputs.get(clusterInputs.size() - 1));
+                }
             }
         }
@@ -244,8 +246,10 @@ public class FalconHook extends FalconEventPublisher {
         if (process.getOutputs() != null) {
             for (Output output : process.getOutputs().getOutputs()) {
                 List<Referenceable> clusterOutputs = getInputOutputEntity(cluster, output.getFeed());
-                entities.addAll(clusterOutputs);
-                outputs.add(clusterOutputs.get(clusterOutputs.size() - 1));
+                if (clusterOutputs != null) {
+                    entities.addAll(clusterOutputs);
+                    outputs.add(clusterOutputs.get(clusterOutputs.size() - 1));
+                }
             }
         }
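For context on the fix above: getInputOutputEntity(cluster, feed) can return null when a feed is not backed by a hive table (the HDFS-feed case that the new integration test below exercises), and the previously unguarded entities.addAll(clusterInputs) then throws a NullPointerException. A minimal standalone illustration of the failure mode and the guard — the names here are stand-ins, not code from the patch:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AddAllNullDemo {
    // Stand-in for getInputOutputEntity(): assume it returns null when the
    // feed has no hive table behind it, as for a pure HDFS feed.
    static List<String> lookupFeedEntities(boolean hasHiveTable) {
        return hasHiveTable ? Arrays.asList("feed-entity") : null;
    }

    public static void main(String[] args) {
        List<String> entities = new ArrayList<>();
        List<String> clusterInputs = lookupFeedEntities(false);

        // Unguarded, this line throws NullPointerException:
        // entities.addAll(clusterInputs);

        if (clusterInputs != null) {    // the guard this commit adds
            entities.addAll(clusterInputs);
        }
        System.out.println("entities: " + entities);   // entities: []
    }
}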
addons/falcon-bridge/src/test/java/org/apache/atlas/falcon/hook/FalconHookIT.java
@@ -43,12 +43,15 @@ import javax.xml.bind.JAXBException;
 import java.util.List;
 
 import static org.testng.Assert.assertEquals;
+import static org.testng.Assert.assertNotNull;
+import static org.testng.Assert.assertNull;
 
 public class FalconHookIT {
     public static final Logger LOG = org.slf4j.LoggerFactory.getLogger(FalconHookIT.class);
 
     public static final String CLUSTER_RESOURCE = "/cluster.xml";
     public static final String FEED_RESOURCE = "/feed.xml";
+    public static final String FEED_HDFS_RESOURCE = "/feed-hdfs.xml";
     public static final String PROCESS_RESOURCE = "/process.xml";
 
     private AtlasClient dgiCLient;
@@ -96,21 +99,13 @@ public class FalconHookIT {
         Cluster cluster = loadEntity(EntityType.CLUSTER, CLUSTER_RESOURCE, "cluster" + random());
         STORE.publish(EntityType.CLUSTER, cluster);
 
-        Feed infeed = loadEntity(EntityType.FEED, FEED_RESOURCE, "feedin" + random());
-        org.apache.falcon.entity.v0.feed.Cluster feedCluster = infeed.getClusters().getClusters().get(0);
-        feedCluster.setName(cluster.getName());
-        String inTableName = "table" + random();
-        String inDbName = "db" + random();
-        feedCluster.getTable().setUri(getTableUri(inDbName, inTableName));
-        STORE.publish(EntityType.FEED, infeed);
+        Feed infeed = getTableFeed(FEED_RESOURCE, cluster.getName());
+        String inTableName = getTableName(infeed);
+        String inDbName = getDBName(infeed);
 
-        Feed outfeed = loadEntity(EntityType.FEED, FEED_RESOURCE, "feedout" + random());
-        feedCluster = outfeed.getClusters().getClusters().get(0);
-        feedCluster.setName(cluster.getName());
-        String outTableName = "table" + random();
-        String outDbName = "db" + random();
-        feedCluster.getTable().setUri(getTableUri(outDbName, outTableName));
-        STORE.publish(EntityType.FEED, outfeed);
+        Feed outfeed = getTableFeed(FEED_RESOURCE, cluster.getName());
+        String outTableName = getTableName(outfeed);
+        String outDbName = getDBName(outfeed);
 
         Process process = loadEntity(EntityType.PROCESS, PROCESS_RESOURCE, "process" + random());
         process.getClusters().getClusters().get(0).setName(cluster.getName());
@@ -120,6 +115,7 @@ public class FalconHookIT {
         String pid = assertProcessIsRegistered(cluster.getName(), process.getName());
         Referenceable processEntity = dgiCLient.getEntity(pid);
+        assertNotNull(processEntity);
         assertEquals(processEntity.get("processName"), process.getName());
 
         Id inId = (Id) ((List) processEntity.get("inputs")).get(0);
@@ -133,7 +129,60 @@ public class FalconHookIT {
                 HiveMetaStoreBridge.getTableQualifiedName(cluster.getName(), outDbName, outTableName));
     }
 
+    private Feed getTableFeed(String feedResource, String clusterName) throws Exception {
+        Feed feed = loadEntity(EntityType.FEED, feedResource, "feed" + random());
+        org.apache.falcon.entity.v0.feed.Cluster feedCluster = feed.getClusters().getClusters().get(0);
+        feedCluster.setName(clusterName);
+        feedCluster.getTable().setUri(getTableUri("db" + random(), "table" + random()));
+        STORE.publish(EntityType.FEED, feed);
+        return feed;
+    }
+
+    private String getDBName(Feed feed) {
+        String uri = feed.getClusters().getClusters().get(0).getTable().getUri();
+        String[] parts = uri.split(":");
+        return parts[1];
+    }
+
+    private String getTableName(Feed feed) {
+        String uri = feed.getClusters().getClusters().get(0).getTable().getUri();
+        String[] parts = uri.split(":");
+        parts = parts[2].split("#");
+        return parts[0];
+    }
+
+    @Test (enabled = true)
+    public void testCreateProcessWithHDFSFeed() throws Exception {
+        Cluster cluster = loadEntity(EntityType.CLUSTER, CLUSTER_RESOURCE, "cluster" + random());
+        STORE.publish(EntityType.CLUSTER, cluster);
+
+        Feed infeed = loadEntity(EntityType.FEED, FEED_HDFS_RESOURCE, "feed" + random());
+        org.apache.falcon.entity.v0.feed.Cluster feedCluster = infeed.getClusters().getClusters().get(0);
+        feedCluster.setName(cluster.getName());
+        STORE.publish(EntityType.FEED, infeed);
+
+        Feed outfeed = getTableFeed(FEED_RESOURCE, cluster.getName());
+        String outTableName = getTableName(outfeed);
+        String outDbName = getDBName(outfeed);
+
+        Process process = loadEntity(EntityType.PROCESS, PROCESS_RESOURCE, "process" + random());
+        process.getClusters().getClusters().get(0).setName(cluster.getName());
+        process.getInputs().getInputs().get(0).setFeed(infeed.getName());
+        process.getOutputs().getOutputs().get(0).setFeed(outfeed.getName());
+        STORE.publish(EntityType.PROCESS, process);
+
+        String pid = assertProcessIsRegistered(cluster.getName(), process.getName());
+        Referenceable processEntity = dgiCLient.getEntity(pid);
+        assertEquals(processEntity.get("processName"), process.getName());
+        assertNull(processEntity.get("inputs"));
+
+        Id outId = (Id) ((List) processEntity.get("outputs")).get(0);
+        Referenceable outEntity = dgiCLient.getEntity(outId._getId());
+        assertEquals(outEntity.get("name"),
+                HiveMetaStoreBridge.getTableQualifiedName(cluster.getName(), outDbName, outTableName));
+    }
+
 //    @Test (enabled = true, dependsOnMethods = "testCreateProcess")
 //    public void testUpdateProcess() throws Exception {
 //        FalconEvent event = createProcessEntity(PROCESS_NAME_2, INPUT, OUTPUT);
 //        FalconEventPublisher.Data data = new FalconEventPublisher.Data(event);
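A note on the two parsing helpers added above: the index arithmetic in getDBName() and getTableName() implies a table URI of the shape catalog:$database:$table#partition-spec. A standalone sketch of how that parsing plays out on a hypothetical URI (the value is illustrative, not taken from the patch):

public class TableUriParseDemo {
    public static void main(String[] args) {
        String uri = "catalog:mydb:mytable#ds=2016-03-09";

        String[] parts = uri.split(":");           // ["catalog", "mydb", "mytable#ds=2016-03-09"]
        String dbName = parts[1];                  // "mydb"    (what getDBName() returns)
        String tableName = parts[2].split("#")[0]; // "mytable" (what getTableName() returns)

        System.out.println(dbName + "." + tableName);   // mydb.mytable
    }
}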
@@ -156,7 +205,7 @@ public class FalconHookIT {
...
@@ -156,7 +205,7 @@ public class FalconHookIT {
}
}
private
String
assertEntityIsRegistered
(
final
String
query
)
throws
Exception
{
private
String
assertEntityIsRegistered
(
final
String
query
)
throws
Exception
{
waitFor
(
20000
,
new
Predicate
()
{
waitFor
(
20000
00
,
new
Predicate
()
{
@Override
@Override
public
boolean
evaluate
()
throws
Exception
{
public
boolean
evaluate
()
throws
Exception
{
JSONArray
results
=
dgiCLient
.
search
(
query
);
JSONArray
results
=
dgiCLient
.
search
(
query
);
...
...
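The only change in this hunk raises the registration-polling timeout from 20000 ms to 2000000 ms. The waitFor implementation itself is not part of this diff; the following is a sketch of what such a helper plausibly looks like, assuming only the call shape visible above (a boolean-returning Predicate.evaluate() and a timeout in milliseconds):

public class WaitForSketch {
    interface Predicate {
        boolean evaluate() throws Exception;
    }

    // Polls the predicate until it holds or the timeout elapses.
    static void waitFor(long timeoutMsecs, Predicate predicate) throws Exception {
        long mustEnd = System.currentTimeMillis() + timeoutMsecs;
        while (!predicate.evaluate()) {
            if (System.currentTimeMillis() >= mustEnd) {
                throw new Exception("Timed out after " + timeoutMsecs + " msecs");
            }
            Thread.sleep(400);   // back off briefly between polls
        }
    }

    public static void main(String[] args) throws Exception {
        final long start = System.currentTimeMillis();
        waitFor(5000, new Predicate() {
            @Override
            public boolean evaluate() {
                return System.currentTimeMillis() - start > 1000;
            }
        });
        System.out.println("condition met within timeout");
    }
}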
addons/falcon-bridge/src/test/resources/feed-hdfs.xml (new file, mode 100644)
<?xml version="1.0" encoding="UTF-8"?>
<!--
  Licensed to the Apache Software Foundation (ASF) under one
  or more contributor license agreements. See the NOTICE file
  distributed with this work for additional information
  regarding copyright ownership. The ASF licenses this file
  to you under the Apache License, Version 2.0 (the
  "License"); you may not use this file except in compliance
  with the License. You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
  -->
<feed description="test input" name="testinput" xmlns="uri:falcon:feed:0.1">
    <groups>online,bi</groups>
    <frequency>hours(1)</frequency>
    <timezone>UTC</timezone>
    <late-arrival cut-off="hours(3)"/>

    <clusters>
        <cluster name="testcluster" type="source">
            <validity start="2010-01-01T00:00Z" end="2012-04-21T00:00Z"/>
            <retention limit="hours(24)" action="delete"/>
        </cluster>
    </clusters>

    <locations>
        <location type="data" path="/tmp/input/${YEAR}-${MONTH}-${DAY}-${HOUR}"/>
    </locations>

    <ACL owner="testuser" group="group" permission="0x755"/>
    <schema location="hcat" provider="hcat"/>
</feed>
release-log.txt
@@ -11,6 +11,7 @@ ATLAS-409 Atlas will not import avro tables with schema read from a file (dosset
 ATLAS-379 Create sqoop and falcon metadata addons (venkatnrangan,bvellanki,sowmyaramesh via shwethags)
 
 ALL CHANGES:
+ATLAS-537 Falcon hook failing when tried to submit a process which creates a hive table ( shwethags via sumasai)
 ATLAS-476 Update type attribute with Reserved characters updated the original type as unknown (yhemanth via shwethags)
 ATLAS-463 Disconnect inverse references ( dkantor via sumasai)
 ATLAS-479 Add description for different types during create time (guptaneeru via shwethags)