Project: dataplatform / atlas

Commit f5c4dc4c, authored Jan 21, 2015 by Venkatesh Seetharam
Code and scripts to add test data to server for UX. Contributed by Venkatesh Seetharam
parent d3db08d2
Showing 3 changed files with 336 additions and 2 deletions:

  InstallationSteps.txt                                          +12   -2
  src/bin/bootstrap-data.sh                                      +79   -0
  .../main/java/org/apache/hadoop/metadata/TestDataDriver.java   +245  -0
InstallationSteps.txt
...
@@ -82,12 +82,22 @@ b. Starting Metadata Server

c. Using Falcon
~~~~~~~~~~~~~~~
* Verify if the server is up and running
  curl -v http://localhost:21000/api/metadata/admin/version
  {"Version":"v0.1"}

* List the types in the repository
  curl -v http://localhost:21000/api/metadata/types/list
  {"list":["biginteger","short","byte","int","string","bigdecimal","boolean","date","double","long","float"],"requestId":"902580786@qtp-1479771328-0"}

* List the instances for a given type
  curl -v http://localhost:21000/api/metadata/entities/list/hive_table
  {"requestId":"788558007@qtp-44808654-5","list":["cb9b5513-c672-42cb-8477-b8f3e537a162","ec985719-a794-4c98-b98f-0509bd23aac0","48998f81-f1d3-45a2-989a-223af5c1ed6e","a54b386e-c759-4651-8779-a099294244c4"]}

  curl -v http://localhost:21000/api/metadata/entities/list/hive_database

d. Stopping Falcon Server
~~~~~~~~~~~~~~~~~~~~~~~~~
...
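The version check above returns a small JSON document. A minimal shell sketch for pulling the "Version" field out of that response shape, with the response inlined here rather than fetched (against a live server it would come from `curl -s http://localhost:21000/api/metadata/admin/version`):

```shell
# Sample response matching the shape shown above; hardcoded so the
# sketch runs without a server.
response='{"Version":"v0.1"}'

# Extract the value of the "Version" key with sed (no JSON parser needed
# for this flat, single-key shape).
version=$(printf '%s' "$response" | sed -n 's/.*"Version":"\([^"]*\)".*/\1/p')

echo "server version: $version"
```

A real deployment would prefer a JSON-aware tool, but for a one-key health probe a sed pattern keeps the dependency footprint at zero.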
src/bin/bootstrap-data.sh (new file, mode 100644)
#!/bin/bash
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License. See accompanying LICENSE file.
#

# resolve links - $0 may be a softlink
PRG="${0}"

while [ -h "${PRG}" ]; do
  ls=`ls -ld "${PRG}"`
  link=`expr "$ls" : '.*-> \(.*\)$'`
  if expr "$link" : '/.*' > /dev/null; then
    PRG="$link"
  else
    PRG=`dirname "${PRG}"`/"$link"
  fi
done

BASEDIR=`dirname ${PRG}`
BASEDIR=`cd ${BASEDIR}/..; pwd`

if [ -z "$METADATA_CONF" ]; then
  METADATA_CONF=${BASEDIR}/conf
fi
export METADATA_CONF

if [ -f "${METADATA_CONF}/metadata-env.sh" ]; then
  . "${METADATA_CONF}/metadata-env.sh"
fi

if test -z ${JAVA_HOME}
then
    JAVA_BIN=`which java`
    JAR_BIN=`which jar`
else
    JAVA_BIN=${JAVA_HOME}/bin/java
    JAR_BIN=${JAVA_HOME}/bin/jar
fi
export JAVA_BIN

if [ ! -e $JAVA_BIN ] || [ ! -e $JAR_BIN ]; then
  echo "$JAVA_BIN and/or $JAR_BIN not found on the system. Please make sure java and jar commands are available."
  exit 1
fi

# default the heap size to 1GB
DEFAULT_JAVA_HEAP_MAX=-Xmx1024m
METADATA_OPTS="$DEFAULT_JAVA_HEAP_MAX $METADATA_OPTS"

METADATACPPATH="$METADATA_CONF"
METADATA_EXPANDED_WEBAPP_DIR=${METADATA_EXPANDED_WEBAPP_DIR:-${BASEDIR}/server/webapp}
export METADATA_EXPANDED_WEBAPP_DIR
METADATACPPATH="${METADATACPPATH}:${METADATA_EXPANDED_WEBAPP_DIR}/metadata/WEB-INF/classes"
METADATACPPATH="${METADATACPPATH}:${METADATA_EXPANDED_WEBAPP_DIR}/metadata/WEB-INF/lib/*:${BASEDIR}/libext/*"

# log and pid dirs for applications
METADATA_LOG_DIR="${METADATA_LOG_DIR:-$BASEDIR/logs}"
export METADATA_LOG_DIR

METADATA_HOME_DIR="${METADATA_HOME_DIR:-$BASEDIR}"
export METADATA_HOME_DIR

JAVA_PROPERTIES="$METADATA_OPTS $METADATA_PROPERTIES -Dmetadata.log.dir=$METADATA_LOG_DIR -Dmetadata.home=${METADATA_HOME_DIR}"

${JAVA_BIN} ${JAVA_PROPERTIES} -cp ${METADATACPPATH} org.apache.hadoop.metadata.TestDataDriver

echo Test data added to Metadata Server!!!
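The script opens with the classic "resolve links - $0 may be a softlink" idiom. A minimal, self-contained sketch of that same loop, demonstrated on a throwaway symlink (all paths below are hypothetical, created only for the demo):

```shell
# Build a fake layout: a real script at real/bin/tool.sh and a symlink
# pointing at it, as if the script had been invoked via the link.
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/real/bin"
touch "$tmpdir/real/bin/tool.sh"
ln -s "$tmpdir/real/bin/tool.sh" "$tmpdir/tool-link"

# Same loop as in bootstrap-data.sh: follow symlinks until PRG is a
# regular file, handling both absolute and relative link targets.
PRG="$tmpdir/tool-link"
while [ -h "$PRG" ]; do
  ls=`ls -ld "$PRG"`
  link=`expr "$ls" : '.*-> \(.*\)$'`
  if expr "$link" : '/.*' > /dev/null; then
    PRG="$link"                       # absolute target: follow directly
  else
    PRG=`dirname "$PRG"`/"$link"      # relative target: resolve against link dir
  fi
done

# BASEDIR is the parent of the directory holding the real script,
# exactly as the bootstrap script computes it.
BASEDIR=`dirname "$PRG"`
BASEDIR=`cd "$BASEDIR/.."; pwd`

echo "resolved: $PRG"
echo "basedir:  $BASEDIR"
```

Parsing `ls -ld` output with `expr` is dated (readlink is the modern choice) but maximally portable, which is why bootstrap scripts of this era used it.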
webapp/src/main/java/org/apache/hadoop/metadata/TestDataDriver.java (new file, mode 100644)
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.hadoop.metadata;

import com.google.common.collect.ImmutableList;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.api.client.config.DefaultClientConfig;
import org.apache.hadoop.metadata.json.Serialization$;
import org.apache.hadoop.metadata.json.TypesSerialization;
import org.apache.hadoop.metadata.types.AttributeDefinition;
import org.apache.hadoop.metadata.types.ClassType;
import org.apache.hadoop.metadata.types.DataTypes;
import org.apache.hadoop.metadata.types.HierarchicalTypeDefinition;
import org.apache.hadoop.metadata.types.IDataType;
import org.apache.hadoop.metadata.types.Multiplicity;
import org.apache.hadoop.metadata.types.StructTypeDefinition;
import org.apache.hadoop.metadata.types.TraitType;
import org.apache.hadoop.metadata.types.TypeSystem;
import org.codehaus.jettison.json.JSONArray;
import org.codehaus.jettison.json.JSONException;
import org.codehaus.jettison.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.ws.rs.HttpMethod;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.UriBuilder;
import java.util.Arrays;

public class TestDataDriver {

    private static final Logger LOG = LoggerFactory.getLogger(TestDataDriver.class);

    private static final String DATABASE_TYPE = "hive_database";
    private static final String TABLE_TYPE = "hive_table";

    protected TypeSystem typeSystem;
    protected WebResource service;

    public void setUp() throws Exception {
        typeSystem = TypeSystem.getInstance();
        typeSystem.reset();

        String baseUrl = "http://localhost:21000/";
        DefaultClientConfig config = new DefaultClientConfig();
        Client client = Client.create(config);
        client.resource(UriBuilder.fromUri(baseUrl).build());

        service = client.resource(UriBuilder.fromUri(baseUrl).build());
    }

    protected AttributeDefinition createRequiredAttrDef(String name, IDataType dataType) {
        return new AttributeDefinition(name, dataType.getName(), Multiplicity.REQUIRED, false, null);
    }

    @SuppressWarnings("unchecked")
    protected HierarchicalTypeDefinition<TraitType> createTraitTypeDef(
            String name, ImmutableList<String> superTypes, AttributeDefinition... attrDefs) {
        return new HierarchicalTypeDefinition(TraitType.class, name, superTypes, attrDefs);
    }

    @SuppressWarnings("unchecked")
    protected HierarchicalTypeDefinition<ClassType> createClassTypeDef(
            String name, ImmutableList<String> superTypes, AttributeDefinition... attrDefs) {
        return new HierarchicalTypeDefinition(ClassType.class, name, superTypes, attrDefs);
    }

    public void submitEntity(ITypedReferenceableInstance tableInstance) throws Exception {
        String tableInstanceAsJSON = Serialization$.MODULE$.toJson(tableInstance);
        LOG.debug("tableInstance = " + tableInstanceAsJSON);

        WebResource resource = service
                .path("api/metadata/entities/submit")
                .path(TABLE_TYPE);

        ClientResponse clientResponse = resource
                .accept(MediaType.APPLICATION_JSON)
                .type(MediaType.APPLICATION_JSON)
                .method(HttpMethod.POST, ClientResponse.class, tableInstanceAsJSON);
        assert clientResponse.getStatus() == Response.Status.OK.getStatusCode();
    }

    public void getEntityList() throws Exception {
        ClientResponse clientResponse = service
                .path("api/metadata/entities/list/")
                .path(TABLE_TYPE)
                .accept(MediaType.APPLICATION_JSON)
                .type(MediaType.APPLICATION_JSON)
                .method(HttpMethod.GET, ClientResponse.class);
        assert clientResponse.getStatus() == Response.Status.OK.getStatusCode();

        String responseAsString = clientResponse.getEntity(String.class);
        JSONObject response = new JSONObject(responseAsString);
        final JSONArray list = response.getJSONArray("list");
        System.out.println("list = " + list);
        assert list != null;
        assert list.length() > 0;
    }

    private void createHiveTypes() throws Exception {
        HierarchicalTypeDefinition<ClassType> databaseTypeDefinition =
                createClassTypeDef(DATABASE_TYPE,
                        ImmutableList.<String>of(),
                        createRequiredAttrDef("name", DataTypes.STRING_TYPE),
                        createRequiredAttrDef("description", DataTypes.STRING_TYPE));

        StructTypeDefinition structTypeDefinition =
                new StructTypeDefinition("serdeType",
                        new AttributeDefinition[]{
                                createRequiredAttrDef("name", DataTypes.STRING_TYPE),
                                createRequiredAttrDef("serde", DataTypes.STRING_TYPE)
                        });

        HierarchicalTypeDefinition<ClassType> tableTypeDefinition =
                createClassTypeDef(TABLE_TYPE,
                        ImmutableList.<String>of(),
                        createRequiredAttrDef("name", DataTypes.STRING_TYPE),
                        createRequiredAttrDef("description", DataTypes.STRING_TYPE),
                        createRequiredAttrDef("type", DataTypes.STRING_TYPE),
                        new AttributeDefinition("serde1", "serdeType", Multiplicity.REQUIRED, false, null),
                        new AttributeDefinition("serde2", "serdeType", Multiplicity.REQUIRED, false, null),
                        new AttributeDefinition("database", DATABASE_TYPE, Multiplicity.REQUIRED, true, null));

        HierarchicalTypeDefinition<TraitType> classificationTypeDefinition =
                createTraitTypeDef("classification",
                        ImmutableList.<String>of(),
                        createRequiredAttrDef("tag", DataTypes.STRING_TYPE));

        typeSystem.defineTypes(
                ImmutableList.of(structTypeDefinition),
                ImmutableList.of(classificationTypeDefinition),
                ImmutableList.of(databaseTypeDefinition, tableTypeDefinition));
    }

    private void submitTypes() throws Exception {
        String typesAsJSON = TypesSerialization.toJson(typeSystem,
                Arrays.asList(new String[]{DATABASE_TYPE, TABLE_TYPE, "serdeType", "classification"}));
        sumbitType(typesAsJSON, TABLE_TYPE);
    }

    private void sumbitType(String typesAsJSON, String type) throws JSONException {
        WebResource resource = service
                .path("api/metadata/types/submit")
                .path(type);

        ClientResponse clientResponse = resource
                .accept(MediaType.APPLICATION_JSON)
                .type(MediaType.APPLICATION_JSON)
                .method(HttpMethod.POST, ClientResponse.class, typesAsJSON);
        assert clientResponse.getStatus() == Response.Status.OK.getStatusCode();

        String responseAsString = clientResponse.getEntity(String.class);
        JSONObject response = new JSONObject(responseAsString);
        assert response.get("typeName").equals(type);
        assert response.get("types") != null;
    }

    private ITypedReferenceableInstance createHiveTableInstance(
            String db, String table, String trait, String serde1, String serde2) throws Exception {
        Referenceable databaseInstance = new Referenceable(DATABASE_TYPE);
        databaseInstance.set("name", db);
        databaseInstance.set("description", db + " database");

        Referenceable tableInstance = new Referenceable(TABLE_TYPE, "classification");
        tableInstance.set("name", table);
        tableInstance.set("description", table + " table");
        tableInstance.set("type", "managed");
        tableInstance.set("database", databaseInstance);

        Struct traitInstance = (Struct) tableInstance.getTrait("classification");
        traitInstance.set("tag", trait);

        Struct serde1Instance = new Struct("serdeType");
        serde1Instance.set("name", serde1);
        serde1Instance.set("serde", serde1);
        tableInstance.set("serde1", serde1Instance);

        Struct serde2Instance = new Struct("serdeType");
        serde2Instance.set("name", serde2);
        serde2Instance.set("serde", serde2);
        tableInstance.set("serde2", serde2Instance);

        ClassType tableType = typeSystem.getDataType(ClassType.class, TABLE_TYPE);
        return tableType.convert(tableInstance, Multiplicity.REQUIRED);
    }

    public static void main(String[] args) throws Exception {
        TestDataDriver driver = new TestDataDriver();
        driver.setUp();
        driver.createHiveTypes();
        driver.submitTypes();

        String[][] data = getTestData();
        for (String[] row : data) {
            ITypedReferenceableInstance tableInstance =
                    driver.createHiveTableInstance(row[0], row[1], row[2], row[3], row[4]);
            driver.submitEntity(tableInstance);
        }

        driver.getEntityList();
    }

    private static String[][] getTestData() {
        return new String[][]{
                {"sales_db", "customer_fact", "pii", "serde1", "serde2"},
                {"sales_db", "sales_dim", "dim", "serde1", "serde2"},
                {"sales_db", "product_dim", "dim", "serde1", "serde2"},
                {"sales_db", "time_dim", "dim", "serde1", "serde2"},
                {"reporting_db", "weekly_sales_summary", "summary", "serde1", "serde2"},
                {"reporting_db", "daily_sales_summary", "summary", "serde1", "serde2"},
                {"reporting_db", "monthly_sales_summary", "summary", "serde1", "serde2"},
                {"reporting_db", "quarterly_sales_summary", "summary", "serde1", "serde2"},
                {"reporting_db", "yearly_sales_summary", "summary", "serde1", "serde2"},
        };
    }
}
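The driver's main loop walks the {database, table, trait, serde1, serde2} rows from getTestData and submits one entity per row. A minimal shell sketch of the same iterate-and-submit pattern; the payload shape here is hypothetical and purely illustrative (the real payload is produced by Serialization$.MODULE$.toJson), and only a subset of the rows is shown:

```shell
# A few of the test rows, colon-separated: database:table:trait.
rows="sales_db:customer_fact:pii
sales_db:sales_dim:dim
reporting_db:daily_sales_summary:summary"

# Iterate, printing a JSON-ish payload per row (where the Java driver
# would POST to api/metadata/entities/submit) and counting submissions.
count=0
while IFS=: read -r db table trait; do
  printf '{"database":"%s","table":"%s","trait":"%s"}\n' "$db" "$table" "$trait"
  count=$((count + 1))
done <<EOF
$rows
EOF

echo "submitted $count tables"
```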