Commit 2fbe8d9b by Venkatesh Seetharam

Merge branch 'apache-local' into dal

Conflicts: repository/src/main/java/org/apache/atlas/services/DefaultMetadataService.java
parents f4579b6b 2febe216
......@@ -29,7 +29,7 @@ You would need the following installed:
1. Building Atlas
--------------------
Building Atlas from the source repository
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
* git clone git@github.com:hortonworks/atlas.git atlas
......@@ -37,7 +37,7 @@ Building DGI from the source repository
* export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=256m" && mvn clean install
2. Deploying Atlas
---------------------
Once the build successfully completes, artifacts can be packaged for deployment.
......@@ -65,21 +65,21 @@ Tar is structured as follows
|- DISCLAIMER.txt
|- CHANGES.txt
3. Installing & running Atlas
--------------------------------
a. Installing Atlas
~~~~~~~~~~~~~~~~~~~~~~
* tar -xzvf apache-atlas-${project.version}-bin.tar.gz
* cd atlas-${project.version}
b. Starting Atlas Server
~~~~~~~~~~~~~~~~~~~~~~~~~
* bin/atlas-start.sh
c. Using Atlas
~~~~~~~~~~~~~~~
* Verify if the server is up and running
......@@ -99,7 +99,14 @@ c. Using DGI
* Search for entities (instances) in the repository
curl -v http://localhost:21000/api/atlas/discovery/search/dsl?query="from hive_table"
d. Using Atlas Dashboard
~~~~~~~~~~~~~~~~~~~~~~~~~
Navigate to http(s)://$host:$port/. The default port is 21000.
e. Stopping Atlas Server
~~~~~~~~~~~~~~~~~~~~~~~~~
* bin/atlas-stop.sh
......@@ -201,18 +201,190 @@
limitations under the License.
APACHE ATLAS SUBCOMPONENTS:
The Apache Atlas project contains subcomponents with separate copyright
notices and license terms. Your use of the source code for these
subcomponents is subject to the terms and conditions of the following
licenses.
=======================================================================
-----------------------------------------------------------------------
The MIT License
-----------------------------------------------------------------------
The Apache Atlas dashboard bundles the following files under the MIT License:
- AngularJS v1.2.28 (http://angularjs.org) - Copyright (c) 2010-2015 Google, Inc.
- angular-bootstrap v0.12.1 (https://github.com/angular-ui/bootstrap) - Copyright (c) 2012-2015 the AngularUI Team
- angular-ui-router v0.2.13 (https://github.com/angular-ui/ui-router) - Copyright (c) 2013-2015 The AngularUI Team, Karsten Sperling
- angular-ui-utils v0.1.1 (https://github.com/angular-ui/ui-utils) - Copyright (c) 2012 the AngularUI Team
- d3 v3.5.5 (https://github.com/mbostock/d3) - Copyright (c) 2013, Michael Bostock
- d3.tip v0.6.6 (https://github.com/mbostock/d3) - Copyright (c) 2013 Justin Palmer
- bootstrap v3.1.1 (http://getbootstrap.com) - Copyright (c) 2011-2014 Twitter, Inc
- lodash v3.0.0 (https://github.com/lodash/lodash) - Copyright 2012-2015 The Dojo Foundation <http://dojofoundation.org/>
Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas,
DocumentCloud and Investigative Reporters & Editors <http://underscorejs.org/>
- font-awesome css/less files v4.2.0 (http://fontawesome.io/) - Created by Dave Gandy
- jquery v2.1.4 (http://jquery.org) - Copyright 2005, 2014 jQuery Foundation and other contributors
All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
-----------------------------------------------------------------------
BSD-style Licenses
-----------------------------------------------------------------------
The Apache Atlas dashboard bundles the following files under BSD licenses:
(3-clause BSD license)
- D3 v3.5.5 (http://d3js.org/) - Copyright (c) 2010-2014, Michael Bostock
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors may
be used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
-----------------------------------------------------------------------
The Open Font License
-----------------------------------------------------------------------
The Apache Atlas dashboard bundles the following fonts under the
SIL Open Font License v1.1 (OFL) - http://scripts.sil.org/OFL
- font-awesome fonts v4.2.0 (http://fontawesome.io/) - Created by Dave Gandy
SIL OPEN FONT LICENSE
Version 1.1 - 26 February 2007
PREAMBLE
The goals of the Open Font License (OFL) are to stimulate worldwide
development of collaborative font projects, to support the font creation
efforts of academic and linguistic communities, and to provide a free and
open framework in which fonts may be shared and improved in partnership
with others.
The OFL allows the licensed fonts to be used, studied, modified and
redistributed freely as long as they are not sold by themselves. The
fonts, including any derivative works, can be bundled, embedded,
redistributed and/or sold with any software provided that any reserved
names are not used by derivative works. The fonts and derivatives,
however, cannot be released under any other type of license. The
requirement for fonts to remain under this license does not apply
to any document created using the fonts or their derivatives.
DEFINITIONS
"Font Software" refers to the set of files released by the Copyright
Holder(s) under this license and clearly marked as such. This may
include source files, build scripts and documentation.
"Reserved Font Name" refers to any names specified as such after the
copyright statement(s).
"Original Version" refers to the collection of Font Software components as
distributed by the Copyright Holder(s).
"Modified Version" refers to any derivative made by adding to, deleting,
or substituting — in part or in whole — any of the components of the
Original Version, by changing formats or by porting the Font Software to a
new environment.
"Author" refers to any designer, engineer, programmer, technical
writer or other person who contributed to the Font Software.
PERMISSION & CONDITIONS
Permission is hereby granted, free of charge, to any person obtaining
a copy of the Font Software, to use, study, copy, merge, embed, modify,
redistribute, and sell modified and unmodified copies of the Font
Software, subject to the following conditions:
1) Neither the Font Software nor any of its individual components,
in Original or Modified Versions, may be sold by itself.
2) Original or Modified Versions of the Font Software may be bundled,
redistributed and/or sold with any software, provided that each copy
contains the above copyright notice and this license. These can be
included either as stand-alone text files, human-readable headers or
in the appropriate machine-readable metadata fields within text or
binary files as long as those fields can be easily viewed by the user.
3) No Modified Version of the Font Software may use the Reserved Font
Name(s) unless explicit written permission is granted by the corresponding
Copyright Holder. This restriction only applies to the primary font name as
presented to the users.
4) The name(s) of the Copyright Holder(s) or the Author(s) of the Font
Software shall not be used to promote, endorse or advertise any
Modified Version, except to acknowledge the contribution(s) of the
Copyright Holder(s) and the Author(s) or with their explicit written
permission.
5) The Font Software, modified or unmodified, in part or in whole,
must be distributed entirely under this license, and must not be
distributed under any other license. The requirement for fonts to
remain under this license does not apply to any document created
using the Font Software.
TERMINATION
This license becomes null and void if any of the above conditions are
not met.
DISCLAIMER
THE FONT SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT
OF COPYRIGHT, PATENT, TRADEMARK, OR OTHER RIGHT. IN NO EVENT SHALL THE
COPYRIGHT HOLDER BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
INCLUDING ANY GENERAL, SPECIAL, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL
DAMAGES, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF THE USE OR INABILITY TO USE THE FONT SOFTWARE OR FROM
OTHER DEALINGS IN THE FONT SOFTWARE.
Apache Atlas (incubating)
Copyright 2011-2014 The Apache Software Foundation
......
......@@ -16,7 +16,7 @@
Metadata and Governance Overview
The Data Governance Initiative (Atlas) framework is an extensible set of core
foundational governance services, enabling enterprises to effectively and
efficiently meet their compliance requirements within Hadoop and allowing
integration with the whole enterprise data ecosystem.
......
<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<artifactId>apache-atlas</artifactId>
<groupId>org.apache.atlas</groupId>
<version>0.1-incubating-SNAPSHOT</version>
<relativePath>../../</relativePath>
</parent>
<artifactId>falcon-bridge</artifactId>
<description>Apache Atlas Falcon Bridge Module</description>
<name>Apache Atlas Falcon Bridge</name>
<packaging>jar</packaging>
<properties>
<falcon.version>0.6.0.2.2.0.0-2041</falcon.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.falcon</groupId>
<artifactId>falcon-client</artifactId>
<version>${falcon.version}</version>
</dependency>
<!-- falcon-client depends on jersey-client in provided scope. Hence explicit dependency -->
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-client</artifactId>
</dependency>
<dependency>
<groupId>org.apache.atlas</groupId>
<artifactId>atlas-typesystem</artifactId>
</dependency>
<dependency>
<groupId>org.apache.atlas</groupId>
<artifactId>atlas-repository</artifactId>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
</dependency>
</dependencies>
</project>
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.falcon;
import com.google.inject.Inject;
import org.apache.atlas.MetadataException;
import org.apache.atlas.repository.MetadataRepository;
import org.apache.atlas.typesystem.ITypedInstance;
import org.apache.atlas.typesystem.Referenceable;
import org.apache.atlas.typesystem.Struct;
import org.apache.atlas.typesystem.types.EnumType;
import org.apache.atlas.typesystem.types.Multiplicity;
import org.apache.atlas.typesystem.types.StructType;
import org.apache.atlas.typesystem.types.TraitType;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.apache.commons.lang.StringUtils;
import org.apache.falcon.client.FalconCLIException;
import org.apache.falcon.client.FalconClient;
import org.apache.falcon.entity.v0.Entity;
import org.apache.falcon.entity.v0.EntityType;
import org.apache.falcon.entity.v0.cluster.Cluster;
import org.apache.falcon.entity.v0.cluster.Interface;
import org.apache.falcon.entity.v0.cluster.Location;
import org.apache.falcon.entity.v0.cluster.Properties;
import org.apache.falcon.entity.v0.cluster.Property;
import org.apache.falcon.resource.EntityList;
import javax.xml.bind.JAXBException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class FalconImporter {
private static final TypeSystem typeSystem = TypeSystem.getInstance();
private final FalconClient client;
private final MetadataRepository repository;
@Inject
public FalconImporter(FalconClient client, MetadataRepository repo) {
this.client = client;
this.repository = repo;
}
private Entity getEntity(FalconClient client, EntityType type, String name) throws FalconCLIException, JAXBException {
String entityStr = client.getDefinition(type.name(), name);
return (Entity) type.getUnmarshaller().unmarshal(new StringReader(entityStr));
}
public void importClusters() throws MetadataException {
try {
EntityList clusters = client.getEntityList(EntityType.CLUSTER.name(), null, null, null, null, null, null, null);
for (EntityList.EntityElement element : clusters.getElements()) {
Cluster cluster = (Cluster) getEntity(client, EntityType.CLUSTER, element.name);
Referenceable clusterRef = new Referenceable(FalconTypeSystem.DefinedTypes.CLUSTER.name());
clusterRef.set("name", cluster.getName());
if (cluster.getACL() != null) {
Struct acl = new Struct(FalconTypeSystem.DefinedTypes.ACL.name());
acl.set("owner", cluster.getACL().getOwner());
acl.set("group", cluster.getACL().getGroup());
acl.set("permission", cluster.getACL().getPermission());
StructType aclType = typeSystem.getDataType(StructType.class, FalconTypeSystem.DefinedTypes.ACL.name());
clusterRef.set("acl", aclType.convert(acl, Multiplicity.REQUIRED));
}
if (StringUtils.isNotEmpty(cluster.getTags())) {
String[] parts = cluster.getTags().split(",");
List<ITypedInstance> tags = new ArrayList<>();
for (String part : parts) {
TraitType tagType = typeSystem.getDataType(TraitType.class, FalconTypeSystem.DefinedTypes.TAG.name());
String[] kv = part.trim().split("=");
Struct tag = new Struct(FalconTypeSystem.DefinedTypes.TAG.name());
tag.set("name", kv[0]);
tag.set("value", kv[1]);
tags.add(tagType.convert(tag, Multiplicity.REQUIRED));
}
clusterRef.set("tags", tags);
}
if (cluster.getProperties() != null) {
clusterRef.set("properties", getMap(cluster.getProperties()));
}
if (cluster.getLocations() != null) {
List<ITypedInstance> locations = new ArrayList<>();
for (Location loc : cluster.getLocations().getLocations()) {
Struct location = new Struct(FalconTypeSystem.DefinedTypes.CLUSTER_LOCATION.name());
EnumType locationType = typeSystem.getDataType(EnumType.class, FalconTypeSystem.DefinedTypes.CLUSTER_LOCATION_TYPE.name());
location.set("type", locationType.fromValue(loc.getName().toUpperCase()));
location.set("path", loc.getPath());
StructType type = typeSystem.getDataType(StructType.class, FalconTypeSystem.DefinedTypes.CLUSTER_LOCATION.name());
locations.add(type.convert(location, Multiplicity.REQUIRED));
}
clusterRef.set("locations", locations);
}
if (cluster.getInterfaces() != null) {
List<ITypedInstance> interfaces = new ArrayList<>();
for (Interface interfaceFld : cluster.getInterfaces().getInterfaces()) {
Struct interfaceStruct = new Struct(FalconTypeSystem.DefinedTypes.CLUSTER_INTERFACE.name());
interfaceStruct.set("type", interfaceFld.getType().name());
interfaceStruct.set("endpoint", interfaceFld.getEndpoint());
interfaceStruct.set("version", interfaceFld.getVersion());
StructType type = typeSystem.getDataType(StructType.class, FalconTypeSystem.DefinedTypes.CLUSTER_INTERFACE.name());
interfaces.add(type.convert(interfaceStruct, Multiplicity.REQUIRED));
}
clusterRef.set("interfaces", interfaces);
}
repository.createEntity(clusterRef);
}
} catch (Exception e) {
throw new MetadataException(e);
}
}
private Map<String, String> getMap(Properties properties) {
Map<String, String> map = new HashMap<>();
for (Property property : properties.getProperties()) {
map.put(property.getName().trim(), property.getValue().trim());
}
return map;
}
}
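The importer above splits a Falcon cluster's tag string ("owner=xyz,team=abc") on commas, then on "=", to build trait instances. A minimal, standalone sketch of that parsing logic (the `TagParser` class and its tolerance for bare keys are illustrative additions, not part of this patch):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Standalone sketch of the tag parsing performed in FalconImporter.importClusters(). */
public class TagParser {

    // Parses a Falcon-style "key=value,key=value" tag string into an ordered map.
    public static Map<String, String> parseTags(String tags) {
        Map<String, String> result = new LinkedHashMap<>();
        if (tags == null || tags.trim().isEmpty()) {
            return result;
        }
        for (String part : tags.split(",")) {
            // Limit the split to 2 so values containing '=' survive intact.
            String[] kv = part.trim().split("=", 2);
            // Tolerate bare keys ("hot") by mapping them to an empty value.
            result.put(kv[0], kv.length > 1 ? kv[1] : "");
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(parseTags("owner=xyz,team=abc"));
    }
}
```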
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.falcon;
import com.google.common.collect.ImmutableList;
import org.apache.atlas.MetadataException;
import org.apache.atlas.typesystem.types.AttributeDefinition;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.DataTypes;
import org.apache.atlas.typesystem.types.EnumTypeDefinition;
import org.apache.atlas.typesystem.types.EnumValue;
import org.apache.atlas.typesystem.types.HierarchicalTypeDefinition;
import org.apache.atlas.typesystem.types.Multiplicity;
import org.apache.atlas.typesystem.types.StructTypeDefinition;
import org.apache.atlas.typesystem.types.TraitType;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.List;
public class FalconTypeSystem {
public static final Logger LOG = LoggerFactory.getLogger(FalconTypeSystem.class);
public static final TypeSystem TYPE_SYSTEM = TypeSystem.getInstance();
private static volatile FalconTypeSystem INSTANCE;
private List<StructTypeDefinition> structTypeDefinitions = new ArrayList<>();
private List<HierarchicalTypeDefinition<TraitType>> traitTypeDefinitions = new ArrayList<>();
private FalconTypeSystem() throws MetadataException {
HierarchicalTypeDefinition<ClassType> cluster = defineCluster();
//TODO define feed and process
TYPE_SYSTEM.defineTypes(ImmutableList.copyOf(structTypeDefinitions), ImmutableList.copyOf(traitTypeDefinitions),
ImmutableList.of(cluster));
}
public static FalconTypeSystem getInstance() throws MetadataException {
if (INSTANCE == null) {
synchronized (FalconTypeSystem.class) {
if (INSTANCE == null) {
INSTANCE = new FalconTypeSystem();
}
}
}
return INSTANCE;
}
private HierarchicalTypeDefinition<ClassType> defineCluster() throws MetadataException {
defineACL();
defineClusterInterface();
defineClusterLocation();
defineTags();
AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
new AttributeDefinition("name", DataTypes.STRING_TYPE.getName(), Multiplicity.REQUIRED, false, null),
new AttributeDefinition("acl", DefinedTypes.ACL.name(), Multiplicity.OPTIONAL, false, null),
new AttributeDefinition("tags", DefinedTypes.TAG.name(), Multiplicity.COLLECTION, false, null),
new AttributeDefinition("locations", TYPE_SYSTEM.defineMapType(DataTypes.STRING_TYPE, DataTypes.STRING_TYPE).getName(), Multiplicity.COLLECTION, false, null),
new AttributeDefinition("interfaces", DefinedTypes.CLUSTER_INTERFACE.name(), Multiplicity.COLLECTION, false, null),
new AttributeDefinition("properties", TYPE_SYSTEM.defineMapType(DataTypes.STRING_TYPE, DataTypes.STRING_TYPE).getName(), Multiplicity.OPTIONAL, false, null),
};
HierarchicalTypeDefinition<ClassType> cluster =
new HierarchicalTypeDefinition<>(ClassType.class, DefinedTypes.CLUSTER.name(), ImmutableList.<String>of(), attributeDefinitions);
LOG.debug("Created definition for " + DefinedTypes.CLUSTER.name());
return cluster;
}
private HierarchicalTypeDefinition<TraitType> defineTags() {
AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
new AttributeDefinition("name", DataTypes.STRING_TYPE.getName(), Multiplicity.REQUIRED, false, null),
new AttributeDefinition("value", DataTypes.STRING_TYPE.getName(), Multiplicity.REQUIRED, false, null)
};
HierarchicalTypeDefinition<TraitType> traitType = new HierarchicalTypeDefinition<>(TraitType.class, DefinedTypes.TAG.name(), ImmutableList.<String>of(), attributeDefinitions);
LOG.debug("Created definition for " + DefinedTypes.TAG.name());
traitTypeDefinitions.add(traitType);
return traitType;
}
private StructTypeDefinition defineClusterLocation() throws MetadataException {
EnumValue[] values = {
new EnumValue("WORKING", 1),
new EnumValue("STAGING", 2),
new EnumValue("TEMP", 3),
};
LOG.debug("Created definition for " + DefinedTypes.CLUSTER_LOCATION_TYPE.name());
EnumTypeDefinition locationType = new EnumTypeDefinition(DefinedTypes.CLUSTER_LOCATION_TYPE.name(), values);
TYPE_SYSTEM.defineEnumType(locationType);
AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
new AttributeDefinition("type", DefinedTypes.CLUSTER_LOCATION_TYPE.name(), Multiplicity.REQUIRED, false, null),
new AttributeDefinition("path", DataTypes.STRING_TYPE.getName(), Multiplicity.REQUIRED, false, null),
};
LOG.debug("Created definition for " + DefinedTypes.CLUSTER_LOCATION.name());
StructTypeDefinition location = new StructTypeDefinition(DefinedTypes.CLUSTER_LOCATION.name(), attributeDefinitions);
structTypeDefinitions.add(location);
return location;
}
private StructTypeDefinition defineClusterInterface() throws MetadataException {
EnumValue[] values = {
new EnumValue("READONLY", 1),
new EnumValue("WRITE", 2),
new EnumValue("EXECUTE", 3),
new EnumValue("WORKFLOW", 4),
new EnumValue("MESSAGING", 5),
new EnumValue("REGISTRY", 6),
};
LOG.debug("Created definition for " + DefinedTypes.CLUSTER_INTERFACE_TYPE.name());
EnumTypeDefinition interfaceType = new EnumTypeDefinition(DefinedTypes.CLUSTER_INTERFACE_TYPE.name(), values);
TYPE_SYSTEM.defineEnumType(interfaceType);
AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
new AttributeDefinition("type", DefinedTypes.CLUSTER_INTERFACE_TYPE.name(), Multiplicity.REQUIRED, false, null),
new AttributeDefinition("endpoint", DataTypes.STRING_TYPE.getName(), Multiplicity.REQUIRED, false, null),
new AttributeDefinition("version", DataTypes.STRING_TYPE.getName(), Multiplicity.REQUIRED, false, null),
};
LOG.debug("Created definition for " + DefinedTypes.CLUSTER_INTERFACE.name());
StructTypeDefinition interfaceEntity = new StructTypeDefinition(DefinedTypes.CLUSTER_INTERFACE.name(), attributeDefinitions);
structTypeDefinitions.add(interfaceEntity);
return interfaceEntity;
}
public enum DefinedTypes {
ACL,
TAG,
CLUSTER,
CLUSTER_INTERFACE,
CLUSTER_INTERFACE_TYPE,
CLUSTER_LOCATION,
CLUSTER_LOCATION_TYPE
}
private StructTypeDefinition defineACL() {
AttributeDefinition[] attributeDefinitions = new AttributeDefinition[]{
new AttributeDefinition("owner", DataTypes.STRING_TYPE.getName(),
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("group", DataTypes.STRING_TYPE.getName(),
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("permission", DataTypes.STRING_TYPE.getName(),
Multiplicity.OPTIONAL, false, null),
};
LOG.debug("Created definition for " + DefinedTypes.ACL.name());
StructTypeDefinition acl = new StructTypeDefinition(DefinedTypes.ACL.name(), attributeDefinitions);
structTypeDefinitions.add(acl);
return acl;
}
}
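FalconTypeSystem.getInstance() uses lazy initialization with double-checked locking. That idiom is only safe under the Java memory model when the instance field is `volatile` and the lock is a stable, private monitor. A minimal generic sketch of the correct pattern (class names here are illustrative, not from the patch):

```java
/** Minimal double-checked-locking sketch. The field must be volatile,
 *  and the class object (not an unrelated logger) serves as the lock. */
public class LazySingleton {
    private static volatile LazySingleton instance;

    private LazySingleton() { }

    public static LazySingleton getInstance() {
        LazySingleton local = instance;          // one volatile read on the fast path
        if (local == null) {
            synchronized (LazySingleton.class) { // lock a monitor owned by this class
                local = instance;
                if (local == null) {
                    instance = local = new LazySingleton();
                }
            }
        }
        return local;
    }
}
```

Without `volatile`, a second thread can observe a partially constructed instance; the extra local variable avoids re-reading the volatile field on the common path.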
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.falcon;
import org.apache.atlas.repository.MetadataRepository;
import org.apache.atlas.typesystem.IReferenceableInstance;
import org.apache.commons.lang.RandomStringUtils;
import org.apache.falcon.client.FalconClient;
import org.apache.falcon.entity.v0.EntityType;
import org.apache.falcon.entity.v0.cluster.Cluster;
import org.apache.falcon.entity.v0.cluster.Interface;
import org.apache.falcon.entity.v0.cluster.Interfaces;
import org.apache.falcon.entity.v0.cluster.Interfacetype;
import org.apache.falcon.entity.v0.cluster.Location;
import org.apache.falcon.entity.v0.cluster.Locations;
import org.apache.falcon.resource.EntityList;
import org.testng.annotations.Test;
import java.io.StringWriter;
import java.util.UUID;
import static org.mockito.Matchers.any;
import static org.mockito.Matchers.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
public class FalconImporterTest {
@Test
public void testImport() throws Exception {
MetadataRepository repo = mock(MetadataRepository.class);
FalconClient client = mock(FalconClient.class);
FalconTypeSystem.getInstance();
FalconImporter importer = new FalconImporter(client, repo);
when(client.getEntityList(EntityType.CLUSTER.name(), null, null, null, null, null, null,
null)).thenReturn(getEntityList());
//TODO Set other fields in cluster
when(client.getDefinition(anyString(), anyString())).thenReturn(getCluster());
when(repo.createEntity(any(IReferenceableInstance.class), anyString())).thenReturn(UUID.randomUUID().toString());
importer.importClusters();
}
public EntityList getEntityList() {
EntityList.EntityElement[] entities = new EntityList.EntityElement[2];
entities[0] = new EntityList.EntityElement();
entities[0].name = "c1";
entities[1] = new EntityList.EntityElement();
entities[1].name = "c2";
return new EntityList(entities);
}
private Interface getInterface(Interfacetype type, String endpoint) {
Interface clusterInterface = new Interface();
clusterInterface.setEndpoint(endpoint);
clusterInterface.setType(type);
clusterInterface.setVersion("2.2");
return clusterInterface;
}
public String getCluster() throws Exception {
Cluster cluster = new Cluster();
cluster.setName(RandomStringUtils.randomAlphabetic(10));
cluster.setColo(RandomStringUtils.randomAlphabetic(5));
cluster.setTags("owner=xyz,team=abc");
Interfaces interfaces = new Interfaces();
Interface clusterInterface = new Interface();
clusterInterface.setEndpoint("hdfs://localhost:8030");
clusterInterface.setType(Interfacetype.WRITE);
clusterInterface.setVersion("2.2");
interfaces.getInterfaces().add(getInterface(Interfacetype.WRITE, "hdfs://localhost:8030"));
interfaces.getInterfaces().add(getInterface(Interfacetype.READONLY, "hdfs://localhost:8030"));
interfaces.getInterfaces().add(getInterface(Interfacetype.EXECUTE, "http://localhost:8040"));
cluster.setInterfaces(interfaces);
Locations locations = new Locations();
locations.getLocations().add(getLocation());
cluster.setLocations(locations);
StringWriter writer = new StringWriter();
EntityType.CLUSTER.getMarshaller().marshal(cluster, writer);
return writer.toString();
}
public Location getLocation() {
Location location = new Location();
location.setName("staging");
location.setPath("/staging");
return location;
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.falcon;
import org.apache.atlas.MetadataException;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.TraitType;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.junit.Assert;
import org.testng.annotations.Test;
public class FalconTypeSystemTest {
@Test
public void testTypeSystem() throws MetadataException {
FalconTypeSystem.getInstance();
Assert.assertNotNull(TypeSystem.getInstance().getDataType(ClassType.class, FalconTypeSystem.DefinedTypes.CLUSTER.name()));
Assert.assertNotNull(TypeSystem.getInstance().getDataType(TraitType.class, FalconTypeSystem.DefinedTypes.TAG.name()));
}
}
......@@ -51,7 +51,7 @@ import java.util.Set;
/**
* A Bridge Utility that imports metadata from the Hive Meta Store
* and registers then in DGI.
* and registers them in Atlas.
*/
public class HiveMetaStoreBridge {
private static final String DEFAULT_DGI_URL = "http://localhost:21000/";
......@@ -159,9 +159,8 @@ public class HiveMetaStoreBridge {
LOG.debug("Getting reference for database {}", databaseName);
String typeName = HiveDataTypes.HIVE_DB.getName();
String dslQuery = String.format("%s where %s = '%s' and %s = '%s'", typeName,
HiveDataModelGenerator.NAME, databaseName.toLowerCase(), HiveDataModelGenerator.CLUSTER_NAME,
clusterName);
String dslQuery = String.format("%s where %s = '%s' and %s = '%s'", typeName, HiveDataModelGenerator.NAME,
databaseName.toLowerCase(), HiveDataModelGenerator.CLUSTER_NAME, clusterName);
return getEntityReferenceFromDSL(typeName, dslQuery);
}
......@@ -170,11 +169,12 @@ public class HiveMetaStoreBridge {
String typeName = HiveDataTypes.HIVE_PROCESS.getName();
//todo enable DSL
// String dslQuery = String.format("%s where queryText = \"%s\"", typeName, queryStr);
// return getEntityReferenceFromDSL(typeName, dslQuery);
// String dslQuery = String.format("%s where queryText = \"%s\"", typeName, queryStr);
// return getEntityReferenceFromDSL(typeName, dslQuery);
String gremlinQuery = String.format("g.V.has('__typeName', '%s').has('%s.queryText', \"%s\").toList()",
typeName, typeName, StringEscapeUtils.escapeJava(queryStr));
String gremlinQuery =
String.format("g.V.has('__typeName', '%s').has('%s.queryText', \"%s\").toList()", typeName, typeName,
StringEscapeUtils.escapeJava(queryStr));
return getEntityReferenceFromGremlin(typeName, gremlinQuery);
}
......@@ -216,9 +216,8 @@ public class HiveMetaStoreBridge {
return getEntityReferenceFromDSL(typeName, dslQuery);
}
private Referenceable getEntityReferenceFromGremlin(String typeName, String gremlinQuery) throws
AtlasServiceException,
JSONException {
private Referenceable getEntityReferenceFromGremlin(String typeName, String gremlinQuery)
throws AtlasServiceException, JSONException {
AtlasClient client = getAtlasClient();
JSONObject response = client.searchByGremlin(gremlinQuery);
JSONArray results = response.getJSONArray(AtlasClient.RESULTS);
......@@ -236,7 +235,8 @@ public class HiveMetaStoreBridge {
//todo replace gremlin with DSL
// String dslQuery = String.format("%s as p where values = %s, tableName where name = '%s', "
// + "dbName where name = '%s' and clusterName = '%s' select p", typeName, valuesStr, tableName,
// + "dbName where name = '%s' and clusterName = '%s' select p", typeName, valuesStr,
// tableName,
// dbName, clusterName);
String datasetType = AtlasClient.DATA_SET_SUPER_TYPE;
......@@ -373,9 +373,8 @@ public class HiveMetaStoreBridge {
return partRef;
}
private void importIndexes(String db, String table,
Referenceable dbReferenceable,
Referenceable tableReferenceable) throws Exception {
private void importIndexes(String db, String table, Referenceable dbReferenceable, Referenceable tableReferenceable)
throws Exception {
List<Index> indexes = hiveClient.getIndexes(db, table, Short.MAX_VALUE);
if (indexes.size() > 0) {
for (Index index : indexes) {
......@@ -385,9 +384,8 @@ public class HiveMetaStoreBridge {
}
//todo should be idempotent
private void importIndex(Index index,
Referenceable dbReferenceable,
Referenceable tableReferenceable) throws Exception {
private void importIndex(Index index, Referenceable dbReferenceable, Referenceable tableReferenceable)
throws Exception {
LOG.info("Importing index {} for {}.{}", index.getIndexName(), dbReferenceable, tableReferenceable);
Referenceable indexRef = new Referenceable(HiveDataTypes.HIVE_INDEX.getName());
......@@ -411,7 +409,8 @@ public class HiveMetaStoreBridge {
createInstance(indexRef);
}
private Referenceable fillStorageDescStruct(StorageDescriptor storageDesc, List<Referenceable> colList) throws Exception {
private Referenceable fillStorageDescStruct(StorageDescriptor storageDesc, List<Referenceable> colList)
throws Exception {
LOG.debug("Filling storage descriptor information for " + storageDesc);
Referenceable sdReferenceable = new Referenceable(HiveDataTypes.HIVE_STORAGEDESC.getName());
......@@ -429,7 +428,8 @@ public class HiveMetaStoreBridge {
sdReferenceable.set("serdeInfo", serdeInfoStruct);
sdReferenceable.set(HiveDataModelGenerator.STORAGE_NUM_BUCKETS, storageDesc.getNumBuckets());
sdReferenceable.set(HiveDataModelGenerator.STORAGE_IS_STORED_AS_SUB_DIRS, storageDesc.isStoredAsSubDirectories());
sdReferenceable
.set(HiveDataModelGenerator.STORAGE_IS_STORED_AS_SUB_DIRS, storageDesc.isStoredAsSubDirectories());
//Use the passed column list if not null, ex: use same references for table and SD
List<FieldSchema> columns = storageDesc.getCols();
......@@ -469,8 +469,7 @@ public class HiveMetaStoreBridge {
return createInstance(sdReferenceable);
}
private List<Referenceable> getColumns(List<FieldSchema> schemaList) throws Exception
{
private List<Referenceable> getColumns(List<FieldSchema> schemaList) throws Exception {
List<Referenceable> colList = new ArrayList<>();
for (FieldSchema fs : schemaList) {
LOG.debug("Processing field " + fs);
......@@ -489,7 +488,7 @@ public class HiveMetaStoreBridge {
AtlasClient dgiClient = getAtlasClient();
//Register hive data model if its not already registered
if (dgiClient.getType(HiveDataTypes.HIVE_PROCESS.getName()) == null ) {
if (dgiClient.getType(HiveDataTypes.HIVE_PROCESS.getName()) == null) {
LOG.info("Registering Hive data model");
dgiClient.createType(dataModelGenerator.getModelAsJson());
} else {
......
......@@ -48,7 +48,7 @@ import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
/**
* DgiHook sends lineage information to the DgiSever.
* AtlasHook sends lineage information to the Atlas server.
*/
public class HiveHook implements ExecuteWithHookContext {
......@@ -83,7 +83,7 @@ public class HiveHook implements ExecuteWithHookContext {
executor = new ThreadPoolExecutor(minThreads, maxThreads, keepAliveTime, TimeUnit.MILLISECONDS,
new LinkedBlockingQueue<Runnable>(),
new ThreadFactoryBuilder().setDaemon(true).setNameFormat("DGI Logger %d").build());
new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Atlas Logger %d").build());
try {
Runtime.getRuntime().addShutdownHook(new Thread() {
......@@ -103,7 +103,7 @@ public class HiveHook implements ExecuteWithHookContext {
LOG.info("Attempting to send msg while shutdown in progress.");
}
LOG.info("Created DGI Hook");
LOG.info("Created Atlas Hook");
}
class HiveEvent {
......@@ -151,7 +151,7 @@ public class HiveHook implements ExecuteWithHookContext {
try {
fireAndForget(event);
} catch (Throwable e) {
LOG.info("DGI hook failed", e);
LOG.info("Atlas hook failed", e);
}
}
});
......@@ -161,7 +161,7 @@ public class HiveHook implements ExecuteWithHookContext {
private void fireAndForget(HiveEvent event) throws Exception {
assert event.hookType == HookContext.HookType.POST_EXEC_HOOK : "Non-POST_EXEC_HOOK not supported!";
LOG.info("Entered DGI hook for hook type {} operation {}", event.hookType, event.operation);
LOG.info("Entered Atlas hook for hook type {} operation {}", event.hookType, event.operation);
HiveMetaStoreBridge dgiBridge = new HiveMetaStoreBridge(event.conf);
if (!typesRegistered) {
......@@ -331,7 +331,7 @@ public class HiveHook implements ExecuteWithHookContext {
explain.initialize(event.conf, event.queryPlan, null);
List<Task<?>> rootTasks = event.queryPlan.getRootTasks();
return explain.getJSONPlan(null, null, rootTasks, event.queryPlan.getFetchTask(), true, false, false);
} catch(Exception e) {
} catch (Exception e) {
LOG.warn("Failed to get queryplan", e);
return new JSONObject();
}
......
---+ Hive DGI Bridge
Hive metadata can be modelled in DGI using its Type System. The default modelling is available in org.apache.atlas.hive.model.HiveDataModelGenerator. It defines the following types:
---+ Hive Atlas Bridge
Hive metadata can be modelled in Atlas using its Type System. The default modelling is available in org.apache.atlas.hive.model.HiveDataModelGenerator. It defines the following types:
* hive_resource_type(EnumType) - [JAR, FILE, ARCHIVE]
* hive_principal_type(EnumType) - [USER, ROLE, GROUP]
* hive_function_type(EnumType) - [JAVA]
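Once these types are registered, instances can be looked up with the DSL search endpoint shown earlier (for example, the query =from hive_table=). A minimal sketch of building that request URL, assuming the default port 21000; the helper class is illustrative and not part of Atlas:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Illustrative helper: builds the DSL search URL used in the curl examples.
public class DslSearchUrl {
    static String build(String baseUrl, String query) {
        // DSL queries contain spaces, so the query must be URL-encoded
        String encoded = URLEncoder.encode(query, StandardCharsets.UTF_8);
        return baseUrl + "/api/atlas/discovery/search/dsl?query=" + encoded;
    }

    public static void main(String[] args) {
        System.out.println(build("http://localhost:21000", "from hive_table"));
    }
}
```

The resulting URL is what the curl-based search examples hit directly.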
......@@ -19,10 +19,10 @@ Hive metadata can be modelled in DGI using its Type System. The default modellin
---++ Importing Hive Metadata
org.apache.atlas.hive.bridge.HiveMetaStoreBridge imports the hive metadata into DGI using the typesystem defined in org.apache.atlas.hive.model.HiveDataModelGenerator. import-hive.sh command can be used to facilitate this.
org.apache.atlas.hive.bridge.HiveMetaStoreBridge imports the Hive metadata into Atlas using the typesystem defined in org.apache.atlas.hive.model.HiveDataModelGenerator. The import-hive.sh command can be used to facilitate this.
Set up the following configs in the hive-site.xml of your Hive set-up, and set the environment variable HIVE_CONFIG to the Hive conf directory:
* DGI endpoint - Add the following property with the DGI endpoint for your set-up
* Atlas endpoint - Add the following property with the Atlas endpoint for your set-up
<verbatim>
<property>
<name>hive.hook.dgi.url</name>
......@@ -38,8 +38,8 @@ Usage: <dgi package>/bin/import-hive.sh. The logs are in <dgi package>/logs/impo
---++ Hive Hook
Hive supports listeners on hive command execution using hive hooks. This is used to add/update/remove entities in DGI using the model defined in org.apache.atlas.hive.model.HiveDataModelGenerator.
The hook submits the request to a thread pool executor to avoid blocking the command execution. Follow the these instructions in your hive set-up to add hive hook for DGI:
Hive supports listeners on hive command execution using hive hooks. This is used to add/update/remove entities in Atlas using the model defined in org.apache.atlas.hive.model.HiveDataModelGenerator.
The hook submits the request to a thread pool executor to avoid blocking the command execution. Follow these instructions in your Hive set-up to add the hive hook for Atlas:
* Add org.apache.atlas.hive.hook.HiveHook as post execution hook in hive-site.xml
<verbatim>
<property>
......@@ -47,7 +47,7 @@ The hook submits the request to a thread pool executor to avoid blocking the com
<value>org.apache.atlas.hive.hook.HiveHook</value>
</property>
</verbatim>
* Add the following properties in hive-ste.xml with the DGI endpoint for your set-up
* Add the following properties in hive-site.xml with the Atlas endpoint for your set-up
<verbatim>
<property>
<name>hive.hook.dgi.url</name>
......
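The asynchronous submission described above (a daemon thread pool so the Hive command is never blocked) can be sketched as follows; the class and method names are illustrative, not the actual HiveHook code:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Future;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch of the hook's pattern: hand the notification to a daemon thread
// pool so the Hive command itself is never blocked by Atlas registration.
public class AsyncHookSketch {
    static final ExecutorService EXECUTOR = new ThreadPoolExecutor(
            1, 5, 10, TimeUnit.SECONDS,
            new LinkedBlockingQueue<Runnable>(),
            r -> {
                Thread t = new Thread(r, "Atlas Logger");
                t.setDaemon(true); // daemon threads do not prevent JVM shutdown
                return t;
            });

    static Future<String> fireAndForget(String queryText) {
        // The real hook would serialize lineage and POST it to the Atlas endpoint
        return EXECUTOR.submit(() -> "registered: " + queryText);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fireAndForget("create table t(id int)").get());
        EXECUTOR.shutdown();
    }
}
```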
......@@ -70,27 +70,22 @@ public class BaseSSLAndKerberosTest extends BaseSecurityTest {
file.delete();
conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, providerUrl);
CredentialProvider provider =
CredentialProviderFactory.getProviders(conf).get(0);
CredentialProvider provider = CredentialProviderFactory.getProviders(conf).get(0);
// create new aliases
try {
char[] storepass = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
KEYSTORE_PASSWORD_KEY, storepass);
provider.createCredentialEntry(KEYSTORE_PASSWORD_KEY, storepass);
char[] trustpass = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
TRUSTSTORE_PASSWORD_KEY, trustpass);
provider.createCredentialEntry(TRUSTSTORE_PASSWORD_KEY, trustpass);
char[] trustpass2 = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
"ssl.client.truststore.password", trustpass2);
provider.createCredentialEntry("ssl.client.truststore.password", trustpass2);
char[] certpass = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
SERVER_CERT_PASSWORD_KEY, certpass);
provider.createCredentialEntry(SERVER_CERT_PASSWORD_KEY, certpass);
// write out so that it can be found in checks
provider.flush();
......@@ -132,8 +127,7 @@ public class BaseSSLAndKerberosTest extends BaseSecurityTest {
hiveConf.setVar(HiveConf.ConfVars.PREEXECHOOKS, "");
hiveConf.setVar(HiveConf.ConfVars.POSTEXECHOOKS, HiveHook.class.getName());
hiveConf.setBoolVar(HiveConf.ConfVars.HIVE_SUPPORT_CONCURRENCY, false);
hiveConf.setVar(HiveConf.ConfVars.METASTOREWAREHOUSE,
System.getProperty("user.dir") + "/target/atlas");
hiveConf.setVar(HiveConf.ConfVars.METASTOREWAREHOUSE, System.getProperty("user.dir") + "/target/atlas");
hiveConf.set(HiveMetaStoreBridge.DGI_URL_PROPERTY, DGI_URL);
hiveConf.set("javax.jdo.option.ConnectionURL", "jdbc:derby:./target/metastore_db;create=true");
hiveConf.set("hive.hook.dgi.synchronous", "true");
......
......@@ -121,8 +121,8 @@ public class HiveHookIT {
private String createTable(boolean partition) throws Exception {
String tableName = tableName();
runCommand("create table " + tableName + "(id int, name string) comment 'table comment' "
+ (partition ? " partitioned by(dt string)" : ""));
runCommand("create table " + tableName + "(id int, name string) comment 'table comment' " + (partition ?
" partitioned by(dt string)" : ""));
return tableName;
}
......@@ -146,7 +146,7 @@ public class HiveHookIT {
final Id sdId = (Id) tableRef.get("sd");
Referenceable sdRef = dgiCLient.getEntity(sdId.id);
Assert.assertEquals(sdRef.get(HiveDataModelGenerator.STORAGE_IS_STORED_AS_SUB_DIRS),false);
Assert.assertEquals(sdRef.get(HiveDataModelGenerator.STORAGE_IS_STORED_AS_SUB_DIRS), false);
//Create table where database doesn't exist, will create database instance as well
assertDatabaseIsRegistered(DEFAULT_DB);
......@@ -154,7 +154,8 @@ public class HiveHookIT {
private String assertColumnIsRegistered(String colName) throws Exception {
LOG.debug("Searching for column {}", colName);
String query = String.format("%s where name = '%s'", HiveDataTypes.HIVE_COLUMN.getName(), colName.toLowerCase());
String query =
String.format("%s where name = '%s'", HiveDataTypes.HIVE_COLUMN.getName(), colName.toLowerCase());
return assertEntityIsRegistered(query, true);
}
......@@ -196,8 +197,9 @@ public class HiveHookIT {
public void testInsert() throws Exception {
String tableName = createTable();
String insertTableName = createTable();
String query = "insert into " + insertTableName + " partition(dt = '2015-01-01') select id, name from "
+ tableName + " where dt = '2015-01-01'";
String query =
"insert into " + insertTableName + " partition(dt = '2015-01-01') select id, name from " + tableName
+ " where dt = '2015-01-01'";
runCommand(query);
assertProcessIsRegistered(query);
......@@ -278,13 +280,14 @@ public class HiveHookIT {
}
private void assertProcessIsRegistered(String queryStr) throws Exception {
// String dslQuery = String.format("%s where queryText = \"%s\"", HiveDataTypes.HIVE_PROCESS.getName(),
// normalize(queryStr));
// assertEntityIsRegistered(dslQuery, true);
// String dslQuery = String.format("%s where queryText = \"%s\"", HiveDataTypes.HIVE_PROCESS.getName(),
// normalize(queryStr));
// assertEntityIsRegistered(dslQuery, true);
//todo replace with DSL
String typeName = HiveDataTypes.HIVE_PROCESS.getName();
String gremlinQuery = String.format("g.V.has('__typeName', '%s').has('%s.queryText', \"%s\").toList()",
typeName, typeName, normalize(queryStr));
String gremlinQuery =
String.format("g.V.has('__typeName', '%s').has('%s.queryText', \"%s\").toList()", typeName, typeName,
normalize(queryStr));
JSONObject response = dgiCLient.searchByGremlin(gremlinQuery);
JSONArray results = response.getJSONArray(AtlasClient.RESULTS);
Assert.assertEquals(results.length(), 1);
......@@ -307,9 +310,9 @@ public class HiveHookIT {
private String assertTableIsRegistered(String dbName, String tableName, boolean registered) throws Exception {
LOG.debug("Searching for table {}.{}", dbName, tableName);
String query = String.format("%s as t where tableName = '%s', db where name = '%s' and clusterName = '%s'"
+ " select t", HiveDataTypes.HIVE_TABLE.getName(), tableName.toLowerCase(), dbName.toLowerCase(),
CLUSTER_NAME);
String query = String.format(
"%s as t where tableName = '%s', db where name = '%s' and clusterName = '%s'" + " select t",
HiveDataTypes.HIVE_TABLE.getName(), tableName.toLowerCase(), dbName.toLowerCase(), CLUSTER_NAME);
return assertEntityIsRegistered(query, registered);
}
......@@ -336,7 +339,7 @@ public class HiveHookIT {
Assert.assertEquals(results.length(), 1);
}
private String assertEntityIsRegistered(String dslQuery, boolean registered) throws Exception{
private String assertEntityIsRegistered(String dslQuery, boolean registered) throws Exception {
JSONArray results = dgiCLient.searchByDSL(dslQuery);
if (registered) {
Assert.assertEquals(results.length(), 1);
......
......@@ -92,7 +92,8 @@ public class NegativeSSLAndKerberosHiveHookIT extends BaseSSLAndKerberosTest {
configuration.setProperty(KEYSTORE_FILE_KEY, "../../webapp/target/atlas.keystore");
configuration.setProperty(CERT_STORES_CREDENTIAL_PROVIDER_PATH, providerUrl);
configuration.setProperty("atlas.http.authentication.type", "kerberos");
configuration.setProperty(SSLFactory.SSL_HOSTNAME_VERIFIER_KEY, SSLHostnameVerifier.DEFAULT_AND_LOCALHOST.toString());
configuration.setProperty(SSLFactory.SSL_HOSTNAME_VERIFIER_KEY,
SSLHostnameVerifier.DEFAULT_AND_LOCALHOST.toString());
configuration.save(new FileWriter(persistDir + File.separator + "client.properties"));
......
......@@ -18,8 +18,8 @@
package org.apache.atlas.hive.hook;
import org.apache.atlas.AtlasException;
import org.apache.atlas.AtlasClient;
import org.apache.atlas.AtlasException;
import org.apache.atlas.PropertiesUtil;
import org.apache.atlas.hive.model.HiveDataTypes;
import org.apache.atlas.security.SecurityProperties;
......@@ -107,7 +107,8 @@ public class SSLAndKerberosHiveHookIT extends BaseSSLAndKerberosTest {
configuration.setProperty(KEYSTORE_FILE_KEY, "../../webapp/target/atlas.keystore");
configuration.setProperty(CERT_STORES_CREDENTIAL_PROVIDER_PATH, providerUrl);
configuration.setProperty("atlas.http.authentication.type", "kerberos");
configuration.setProperty(SSLFactory.SSL_HOSTNAME_VERIFIER_KEY, SSLHostnameVerifier.DEFAULT_AND_LOCALHOST.toString());
configuration.setProperty(SSLFactory.SSL_HOSTNAME_VERIFIER_KEY,
SSLHostnameVerifier.DEFAULT_AND_LOCALHOST.toString());
configuration.save(new FileWriter(persistDir + File.separator + "client.properties"));
......@@ -215,7 +216,8 @@ public class SSLAndKerberosHiveHookIT extends BaseSSLAndKerberosTest {
assertInstanceIsRegistered(HiveDataTypes.HIVE_DB.getName(), "name", dbName);
}
private void assertInstanceIsRegistered(final String typeName, final String colName, final String colValue) throws Exception {
private void assertInstanceIsRegistered(final String typeName, final String colName, final String colValue)
throws Exception {
Subject.doAs(subject, new PrivilegedExceptionAction<Object>() {
@Override
public Object run() throws Exception {
......
......@@ -18,8 +18,8 @@
package org.apache.atlas.hive.hook;
import org.apache.atlas.AtlasException;
import org.apache.atlas.AtlasClient;
import org.apache.atlas.AtlasException;
import org.apache.atlas.hive.bridge.HiveMetaStoreBridge;
import org.apache.atlas.hive.model.HiveDataTypes;
import org.apache.atlas.security.SecurityProperties;
......@@ -73,7 +73,9 @@ public class SSLHiveHookIT {
super(port, path);
}
public Server getServer () { return server; }
public Server getServer() {
return server;
}
@Override
public PropertiesConfiguration getConfiguration() {
......@@ -113,7 +115,8 @@ public class SSLHiveHookIT {
configuration.setProperty(TRUSTSTORE_FILE_KEY, "../../webapp/target/atlas.keystore");
configuration.setProperty(KEYSTORE_FILE_KEY, "../../webapp/target/atlas.keystore");
configuration.setProperty(CERT_STORES_CREDENTIAL_PROVIDER_PATH, providerUrl);
configuration.setProperty(SSLFactory.SSL_HOSTNAME_VERIFIER_KEY, SSLHostnameVerifier.DEFAULT_AND_LOCALHOST.toString());
configuration.setProperty(SSLFactory.SSL_HOSTNAME_VERIFIER_KEY,
SSLHostnameVerifier.DEFAULT_AND_LOCALHOST.toString());
configuration.save(new FileWriter(persistDir + File.separator + "client.properties"));
......@@ -153,27 +156,22 @@ public class SSLHiveHookIT {
file.delete();
conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, providerUrl);
CredentialProvider provider =
CredentialProviderFactory.getProviders(conf).get(0);
CredentialProvider provider = CredentialProviderFactory.getProviders(conf).get(0);
// create new aliases
try {
char[] storepass = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
KEYSTORE_PASSWORD_KEY, storepass);
provider.createCredentialEntry(KEYSTORE_PASSWORD_KEY, storepass);
char[] trustpass = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
TRUSTSTORE_PASSWORD_KEY, trustpass);
provider.createCredentialEntry(TRUSTSTORE_PASSWORD_KEY, trustpass);
char[] trustpass2 = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
"ssl.client.truststore.password", trustpass2);
provider.createCredentialEntry("ssl.client.truststore.password", trustpass2);
char[] certpass = {'k', 'e', 'y', 'p', 'a', 's', 's'};
provider.createCredentialEntry(
SERVER_CERT_PASSWORD_KEY, certpass);
provider.createCredentialEntry(SERVER_CERT_PASSWORD_KEY, certpass);
// write out so that it can be found in checks
provider.flush();
......@@ -217,7 +215,7 @@ public class SSLHiveHookIT {
assertInstanceIsRegistered(HiveDataTypes.HIVE_DB.getName(), "name", dbName);
}
private void assertInstanceIsRegistered(String typeName, String colName, String colValue) throws Exception{
private void assertInstanceIsRegistered(String typeName, String colName, String colValue) throws Exception {
JSONArray results = dgiCLient.rawSearch(typeName, colName, colValue);
Assert.assertEquals(results.length(), 1);
}
......
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
/test-output/
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.atlas</groupId>
<artifactId>atlas-bridge-parent</artifactId>
<version>0.1-incubating-SNAPSHOT</version>
</parent>
<artifactId>atlas-bridge-core</artifactId>
<dependencies>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-metastore</artifactId>
<version>0.14.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.atlas</groupId>
<artifactId>atlas-repository</artifactId>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
</dependency>
<dependency>
<groupId>org.mortbay.jetty</groupId>
<artifactId>jetty</artifactId>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-configuration</groupId>
<artifactId>commons-configuration</artifactId>
</dependency>
<dependency>
<groupId>com.google.inject.extensions</groupId>
<artifactId>guice-multibindings</artifactId>
<version>3.0</version>
</dependency>
</dependencies>
</project>
\ No newline at end of file
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
import com.google.common.collect.ImmutableList;
import org.apache.atlas.MetadataException;
import org.apache.atlas.repository.MetadataRepository;
import org.apache.atlas.repository.RepositoryException;
import org.apache.atlas.typesystem.ITypedReferenceableInstance;
import org.apache.atlas.typesystem.Referenceable;
import org.apache.atlas.typesystem.types.AttributeDefinition;
import org.apache.atlas.typesystem.types.AttributeInfo;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.HierarchicalTypeDefinition;
import org.apache.atlas.typesystem.types.Multiplicity;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.slf4j.Logger;
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map.Entry;
public abstract class ABridge implements IBridge {
protected static final Logger LOG = BridgeManager.LOG;
protected ArrayList<Class<? extends AEntityBean>> typeBeanClasses = new ArrayList<>();
MetadataRepository repo;
protected ABridge(MetadataRepository repo) {
this.repo = repo;
}
protected HierarchicalTypeDefinition<ClassType> createClassTypeDef(String name, ImmutableList<String> superTypes,
AttributeDefinition... attrDefs) {
return new HierarchicalTypeDefinition(ClassType.class, name, superTypes, attrDefs);
}
public ArrayList<Class<? extends AEntityBean>> getTypeBeanClasses() {
return typeBeanClasses;
}
public AEntityBean get(String id) throws RepositoryException {
// get from the system by id (?)
ITypedReferenceableInstance ref = repo.getEntityDefinition(id);
// turn into a HiveLineageBean
try {
Class<AEntityBean> c = getTypeBeanInListByName(ref.getTypeName());
return this.convertFromITypedReferenceable(ref, c);
} catch (BridgeException | InstantiationException | IllegalAccessException |
IllegalArgumentException | InvocationTargetException | NoSuchMethodException |
SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return null;
}
public String create(AEntityBean bean) throws MetadataException {
ClassType type = TypeSystem.getInstance()
.getDataType(ClassType.class, bean.getClass().getSimpleName());
ITypedReferenceableInstance refBean = null;
try {
refBean = type.convert(this.convertToReferencable(bean), Multiplicity.REQUIRED);
String id = repo.createEntity(refBean);
return id;
} catch (IllegalArgumentException | IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
throw new MetadataException("Cannot create entity");
}
public Iterable<String> list() throws RepositoryException {
List<String> returnList = new ArrayList<>();
for (Class c : typeBeanClasses) {
returnList.addAll(repo.getEntityList(c.getSimpleName()));
}
return returnList;
}
protected final boolean containsType(String s) {
for (Class c : typeBeanClasses) {
if (c.getSimpleName().equals(s)) {
return true;
}
}
return false;
}
protected final Class<AEntityBean> getTypeBeanInListByName(String s) throws BridgeException {
for (Class c : typeBeanClasses) {
if (c.getSimpleName().equals(s)) {
return c;
}
}
throw new BridgeException("No EntityBean Definition Found");
}
protected final <T extends AEntityBean> Referenceable convertToReferencable(T o)
throws IllegalArgumentException, IllegalAccessException {
Referenceable selfAware = new Referenceable(o.getClass().getSimpleName());
// TODO - support non-primitive types and deep inspection
for (Field f : o.getClass().getFields()) {
selfAware.set(f.getName(), f.get(o));
}
return selfAware;
}
protected final <T extends AEntityBean> T convertFromITypedReferenceable(
ITypedReferenceableInstance instance, Class<? extends AEntityBean> c)
throws InstantiationException, IllegalAccessException, IllegalArgumentException,
InvocationTargetException, NoSuchMethodException, SecurityException, BridgeException {
if (!instance.getTypeName().equals(c.getSimpleName())) {
throw new BridgeException("ReferenceableInstance type not the same as bean");
}
Object retObj = c.newInstance();
for (Entry<String, AttributeInfo> e : instance.fieldMapping().fields.entrySet()) {
try {
// Invoke the setter on the newly created bean, not on the bridge itself
String convertedName = e.getKey().substring(0, 1).toUpperCase() + e.getKey().substring(1);
c.getMethod("set" + convertedName, Class.forName(e.getValue().dataType().getName()))
.invoke(retObj, instance.get(e.getKey()));
} catch (MetadataException | ClassNotFoundException e1) {
// TODO Auto-generated catch block
e1.printStackTrace();
}
}
return (T) retObj;
}
}
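The reflective conversion performed by convertToReferencable above (copying every public field of a bean into attribute values) can be illustrated standalone; the bean class here is hypothetical:

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch of the bridge's bean conversion: copy every public
// field of a bean into a name -> value map (a Referenceable in Atlas).
public class FieldCopySketch {
    public static class ClusterBean {
        public String name = "staging";
        public int replication = 3;
    }

    static Map<String, Object> toAttributes(Object bean) throws IllegalAccessException {
        Map<String, Object> attrs = new LinkedHashMap<>();
        for (Field f : bean.getClass().getFields()) {
            attrs.put(f.getName(), f.get(bean));
        }
        return attrs;
    }

    public static void main(String[] args) throws Exception {
        Map<String, Object> attrs = toAttributes(new ClusterBean());
        System.out.println(attrs.get("name") + " / " + attrs.get("replication"));
    }
}
```

As in ABridge, only public primitive-like fields are handled; nested objects and collections are out of scope here too.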
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
import org.apache.hadoop.hive.metastore.api.MetaException;
public class BridgeException extends MetaException {
/**
*
*/
private static final long serialVersionUID = -384401342591560473L;
public BridgeException(String msg) {
super(msg);
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
//TODO - Create Index Annotation Framework for BeanConverter
//TODO - Enhance Bean Conversion to handle nested objects
//TODO - Enhance Bean Conversion to handle Collections
import org.apache.atlas.MetadataException;
import org.apache.atlas.repository.MetadataRepository;
import org.apache.atlas.typesystem.types.AttributeDefinition;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.HierarchicalTypeDefinition;
import org.apache.atlas.typesystem.types.Multiplicity;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Inject;
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
public class BridgeManager {
public static final Logger LOG = LoggerFactory.getLogger("BridgeLogger");
private final static String bridgeFileDefault = "bridge-manager.properties";
TypeSystem ts;
MetadataRepository rs;
ArrayList<ABridge> activeBridges;
@Inject
BridgeManager(MetadataRepository rs)
throws ConfigurationException, ClassNotFoundException, InstantiationException,
IllegalAccessException, IllegalArgumentException, InvocationTargetException,
NoSuchMethodException, SecurityException {
this.ts = TypeSystem.getInstance();
this.rs = rs;
if (System.getProperty("bridgeManager.propsFile") != null &&
System.getProperty("bridgeManager.propsFile").length() != 0) {
setActiveBridges(System.getProperty("bridgeManager.propsFile"));
} else {
setActiveBridges(bridgeFileDefault);
}
for (ABridge bridge : activeBridges) {
try {
this.loadTypes(bridge, ts);
} catch (MetadataException e) {
BridgeManager.LOG.error(e.getMessage(), e);
e.printStackTrace();
}
}
}
public final static HierarchicalTypeDefinition<ClassType>
convertEntityBeanToClassTypeDefinition(
Class<? extends AEntityBean> class1) {
ArrayList<AttributeDefinition> attDefAL = new ArrayList<AttributeDefinition>();
for (Field f : class1.getFields()) {
try {
attDefAL.add(BridgeManager.convertFieldtoAttributeDefiniton(f));
} catch (MetadataException e) {
BridgeManager.LOG.error("Class " + class1.getName() +
" cannot be converted to TypeDefinition");
e.printStackTrace();
}
}
        HierarchicalTypeDefinition<ClassType> typeDef = new HierarchicalTypeDefinition<>(
                ClassType.class, class1.getSimpleName(),
                null, attDefAL.toArray(new AttributeDefinition[0]));
        return typeDef;
}
public final static AttributeDefinition convertFieldtoAttributeDefiniton(Field f)
throws MetadataException {
return new AttributeDefinition(f.getName(), f.getType().getSimpleName(),
Multiplicity.REQUIRED, false, null);
}
public ArrayList<ABridge> getActiveBridges() {
return this.activeBridges;
}
private void setActiveBridges(String bridgePropFileName) {
if (bridgePropFileName == null || bridgePropFileName.isEmpty()) {
bridgePropFileName = BridgeManager.bridgeFileDefault;
}
ArrayList<ABridge> aBList = new ArrayList<ABridge>();
PropertiesConfiguration config = new PropertiesConfiguration();
try {
BridgeManager.LOG.info("Loading : Active Bridge List");
config.load(bridgePropFileName);
String[] activeBridgeList = ((String) config.getProperty("BridgeManager.activeBridges"))
.split(",");
BridgeManager.LOG.info("Loaded : Active Bridge List");
BridgeManager.LOG.info("First Loaded :" + activeBridgeList[0]);
for (String s : activeBridgeList) {
Class<?> bridgeCls = (Class<?>) Class.forName(s);
if (ABridge.class.isAssignableFrom(bridgeCls)) {
                    BridgeManager.LOG.info(s + " can be instantiated");
aBList.add((ABridge) bridgeCls.getConstructor(MetadataRepository.class)
.newInstance(rs));
}
}
} catch (InstantiationException | ConfigurationException | IllegalAccessException |
IllegalArgumentException | InvocationTargetException | NoSuchMethodException |
SecurityException | ClassNotFoundException e) {
BridgeManager.LOG.error(e.getMessage(), e);
e.printStackTrace();
}
this.activeBridges = aBList;
}
    private final boolean loadTypes(ABridge bridge, TypeSystem ts) throws MetadataException {
        for (Class<? extends AEntityBean> clazz : bridge.getTypeBeanClasses()) {
            ts.defineClassType(BridgeManager.convertEntityBeanToClassTypeDefinition(clazz));
        }
        return true;
    }
}
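BridgeManager discovers its active bridges by splitting a comma-separated property, loading each entry with `Class.forName`, and keeping only classes assignable to the bridge interface. The same discovery loop can be sketched self-contained with only the JDK (the `Bridge` interface and `DemoBridge` class below are stand-ins, not Atlas types):

```java
import java.util.ArrayList;
import java.util.List;

public class ActiveBridgeListDemo {
    // Stand-ins for the real IBridge hierarchy; names here are illustrative only.
    interface Bridge {}
    public static class DemoBridge implements Bridge {}

    // Mirror of the discovery loop: split the configured list, load each class
    // by name, and keep only the entries that implement Bridge.
    public static List<Class<?>> resolve(String csv) {
        List<Class<?>> found = new ArrayList<>();
        for (String name : csv.split(",")) {
            try {
                Class<?> cls = Class.forName(name.trim());
                if (Bridge.class.isAssignableFrom(cls)) {
                    found.add(cls);
                }
            } catch (ClassNotFoundException e) {
                // Skip entries that are not on the classpath.
            }
        }
        return found;
    }

    public static void main(String[] args) {
        // java.lang.String loads fine but is filtered out by the isAssignableFrom check.
        List<Class<?>> bridges =
                resolve(DemoBridge.class.getName() + ",java.lang.String");
        System.out.println(bridges.size());
    }
}
```

The `isAssignableFrom` guard is what keeps a typo in the properties file from injecting an arbitrary class into the bridge list; unknown names are simply skipped.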
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
import org.apache.atlas.MetadataException;
import org.apache.atlas.typesystem.types.AttributeDefinition;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.HierarchicalTypeDefinition;
import org.apache.atlas.typesystem.types.Multiplicity;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Inject;
import javax.inject.Singleton;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Map;
@Singleton
public class BridgeTypeBootstrapper {
private static final Logger LOG = LoggerFactory.getLogger(BridgeTypeBootstrapper.class);
private final Map<Class, IBridge> bridges;
private boolean isSetup = false;
@Inject
BridgeTypeBootstrapper(Map<Class, IBridge> bridges)
throws MetadataException {
this.bridges = bridges;
}
public final static HierarchicalTypeDefinition<ClassType>
convertEntityBeanToClassTypeDefinition(
Class<? extends AEntityBean> class1) {
ArrayList<AttributeDefinition> attDefAL = new ArrayList<AttributeDefinition>();
for (Field f : class1.getFields()) {
try {
attDefAL.add(BridgeTypeBootstrapper.convertFieldtoAttributeDefiniton(f));
} catch (MetadataException e) {
                LOG.error("Class " + class1.getName()
                        + " cannot be converted to TypeDefinition", e);
            }
}
}
        HierarchicalTypeDefinition<ClassType> typeDef = new HierarchicalTypeDefinition<>(
                ClassType.class, class1.getSimpleName(), null,
                attDefAL.toArray(new AttributeDefinition[0]));
        return typeDef;
}
public final static AttributeDefinition convertFieldtoAttributeDefiniton(
Field f) throws MetadataException {
return new AttributeDefinition(f.getName(),
f.getType().getSimpleName().toLowerCase(), Multiplicity.REQUIRED, false, null);
}
public synchronized boolean bootstrap() throws MetadataException {
if (isSetup)
return false;
else {
LOG.info("Bootstrapping types");
_bootstrap();
isSetup = true;
LOG.info("Bootstrapping complete");
return true;
}
}
private void _bootstrap() throws MetadataException {
TypeSystem ts = TypeSystem.getInstance();
for (IBridge bridge : bridges.values()) {
            LOG.info("Registering bridge {}", bridge.getClass().getSimpleName());
loadTypes(bridge, ts);
}
}
    private final boolean loadTypes(IBridge bridge, TypeSystem ts)
            throws MetadataException {
        for (Class<? extends AEntityBean> clazz : bridge.getTypeBeanClasses()) {
            LOG.info("Registering {}", clazz.getSimpleName());
            ts.defineClassType(BridgeTypeBootstrapper
                    .convertEntityBeanToClassTypeDefinition(clazz));
        }
        return true;
    }
}
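BridgeTypeBootstrapper builds a type definition by walking a bean's public fields and turning each into an attribute whose type is the lower-cased simple name of the field's Java type. A minimal JDK-only sketch of that field scan (the `SampleBean` class is hypothetical, and a plain name-to-type map stands in for Atlas `AttributeDefinition`s):

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldToAttributeDemo {
    // Hypothetical entity bean; only public fields are scanned, as in the bridge code.
    public static class SampleBean {
        public String queryId;
        public boolean success;
    }

    // Mirror of convertFieldtoAttributeDefiniton: attribute name from the field
    // name, attribute type from the lower-cased simple type name.
    public static Map<String, String> attributes(Class<?> beanClass) {
        Map<String, String> attrs = new LinkedHashMap<>();
        for (Field f : beanClass.getFields()) {
            attrs.put(f.getName(), f.getType().getSimpleName().toLowerCase());
        }
        return attrs;
    }

    public static void main(String[] args) {
        System.out.println(attributes(SampleBean.class));
    }
}
```

Note that `getFields()` only returns public fields (including inherited ones), which is why the Atlas entity beans declare their attributes as public members.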
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
import java.util.ArrayList;
public interface IBridge {
ArrayList<Class<? extends AEntityBean>> getTypeBeanClasses();
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.hivelineage;
import org.apache.atlas.bridge.ABridge;
import org.apache.atlas.bridge.hivelineage.hook.HiveLineage;
import org.apache.atlas.repository.MetadataRepository;
import javax.inject.Inject;
public class HiveLineageBridge extends ABridge {
@Inject
HiveLineageBridge(MetadataRepository mr) {
super(mr);
this.typeBeanClasses.add(HiveLineage.class);
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.hivelineage.hook;
import org.apache.atlas.bridge.AEntityBean;
import java.io.Serializable;
import java.util.ArrayList;
public class HiveLineage extends AEntityBean implements Serializable {
/**
*
*/
private static final long serialVersionUID = 1L;
public String queryId;
public String hiveId;
public String user;
public String queryStartTime;
public String queryEndTime;
public String query;
public String tableName;
public String tableLocation;
public boolean success;
public boolean failed;
public String executionEngine;
ArrayList<SourceTables> sourceTables;
ArrayList<QueryColumns> queryColumns;
ArrayList<WhereClause> whereClause;
ArrayList<CreateColumns> createColumns;
ArrayList<GroupBy> groupBy;
ArrayList<GroupBy> orderBy;
public String getQueryId() {
return this.queryId;
}
public void setQueryId(String queryId) {
this.queryId = queryId;
}
public String getExecutionEngine() {
return this.executionEngine;
}
public void setExecutionEngine(String executionEngine) {
this.executionEngine = executionEngine;
}
public String getHiveId() {
return this.hiveId;
}
public void setHiveId(String hiveId) {
this.hiveId = hiveId;
}
public boolean getSuccess() {
return this.success;
}
public void setSuccess(boolean success) {
this.success = success;
}
public boolean getFailed() {
return this.failed;
}
public void setFailed(boolean failed) {
this.failed = failed;
}
public String getTableName() {
return this.tableName;
}
public void setTableName(String tableName) {
this.tableName = tableName;
}
public String getTableLocation() {
return this.tableLocation;
}
public void setTableLocation(String tableLocation) {
this.tableLocation = tableLocation;
}
public String getUser() {
return this.user;
}
public void setUser(String user) {
this.user = user;
}
public String getQueryStartTime() {
return this.queryStartTime;
}
public void setQueryStartTime(String queryStartTime) {
this.queryStartTime = queryStartTime;
}
public String getQueryEndTime() {
return this.queryEndTime;
}
public void setQueryEndTime(String queryEndTime) {
this.queryEndTime = queryEndTime;
}
public String getQuery() {
return this.query;
}
public void setQuery(String query) {
this.query = query;
}
public ArrayList<SourceTables> getSourceTables() {
return this.sourceTables;
}
public void setSourceTables(ArrayList<SourceTables> sourceTables) {
this.sourceTables = sourceTables;
}
public ArrayList<QueryColumns> getQueryColumns() {
return this.queryColumns;
}
public void setQueryColumns(ArrayList<QueryColumns> queryColumns) {
this.queryColumns = queryColumns;
}
public ArrayList<WhereClause> getWhereClause() {
return this.whereClause;
}
public void setWhereClause(ArrayList<WhereClause> whereClause) {
this.whereClause = whereClause;
}
public ArrayList<GroupBy> getGroupBy() {
return this.groupBy;
}
public void setGroupBy(ArrayList<GroupBy> groupBy) {
this.groupBy = groupBy;
}
public ArrayList<CreateColumns> getCreateColumns() {
return this.createColumns;
}
public void setCreateColumns(ArrayList<CreateColumns> createColumns) {
this.createColumns = createColumns;
}
public class SourceTables {
public String tableName;
public String tableAlias;
public String databaseName;
public String getTableName() {
return this.tableName;
}
public void setTableName(String tableName) {
this.tableName = tableName;
}
public String getTableAlias() {
return this.tableAlias;
}
public void setTableAlias(String tableAlias) {
this.tableAlias = tableAlias;
}
public String getDatabaseName() {
return this.databaseName;
}
public void setDatabaseName(String databaseName) {
this.databaseName = databaseName;
}
}
public class QueryColumns {
public String tbAliasOrName;
public String columnName;
public String columnAlias;
public String columnFunction;
public String getTbAliasOrName() {
return this.tbAliasOrName;
}
public void setTbAliasOrName(String tbAliasOrName) {
this.tbAliasOrName = tbAliasOrName;
}
public String getColumnName() {
return this.columnName;
}
public void setColumnName(String columnName) {
this.columnName = columnName;
}
public String getColumnAlias() {
return this.columnAlias;
}
public void setColumnAlias(String columnAlias) {
this.columnAlias = columnAlias;
}
public String getColumnFunction() {
return this.columnFunction;
}
public void setColumnFunction(String columnFunction) {
this.columnFunction = columnFunction;
}
}
public class GroupBy {
public String tbAliasOrName;
public String columnName;
public String getTbAliasOrName() {
return this.tbAliasOrName;
}
public void setTbAliasOrName(String tbAliasOrName) {
this.tbAliasOrName = tbAliasOrName;
}
public String getColumnName() {
return this.columnName;
}
public void setColumnName(String columnName) {
this.columnName = columnName;
}
}
public class WhereClause {
public String tbAliasOrName;
public String columnCondition;
public String columnName;
public String columnOperator;
public String columnValue;
public String getColumnCondition() {
return this.columnCondition;
}
public void setColumnCondition(String columnCondition) {
this.columnCondition = columnCondition;
}
public String getTbAliasOrName() {
return this.tbAliasOrName;
}
public void setTbAliasOrName(String tbAliasOrName) {
this.tbAliasOrName = tbAliasOrName;
}
public String getColumnName() {
return this.columnName;
}
public void setColumnName(String columnName) {
this.columnName = columnName;
}
public String getColumnOperator() {
return this.columnOperator;
}
public void setColumnOperator(String columnOperator) {
this.columnOperator = columnOperator;
}
public String getColumnValue() {
return this.columnValue;
}
public void setColumnValue(String columnValue) {
this.columnValue = columnValue;
}
}
public class CreateColumns {
public String columnName;
public String columnType;
public String getColumnName() {
return this.columnName;
}
public void setColumnName(String columnName) {
this.columnName = columnName;
}
public String getColumnType() {
return this.columnType;
}
public void setColumnType(String columnType) {
this.columnType = columnType;
}
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.hivestructure;
import org.apache.atlas.MetadataException;
import org.apache.atlas.repository.IRepository;
import org.apache.atlas.repository.RepositoryException;
import org.apache.atlas.typesystem.Referenceable;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.TypeSystem;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.Database;
import org.apache.hadoop.hive.metastore.api.FieldSchema;
import org.apache.hadoop.hive.metastore.api.MetaException;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;
import org.apache.hadoop.hive.metastore.api.Table;
import org.apache.hadoop.hive.metastore.api.UnknownDBException;
import org.apache.hadoop.hive.metastore.api.UnknownTableException;
import org.apache.thrift.TException;
/*
 * Initial pass at a one-time importer. TODO - needs re-write.
 */
public class HiveMetaImporter {
private static HiveMetaStoreClient msc;
private static IRepository repo;
public HiveMetaImporter(IRepository repo) {
try {
this.repo = repo;
msc = new HiveMetaStoreClient(new HiveConf());
// TODO Get hive-site.conf from class path first
} catch (MetaException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
    public static boolean fullImport() {
        try {
            databasesImport();
            for (String dbName : msc.getAllDatabases()) {
                tablesImport(dbName);
                for (String tbName : msc.getAllTables(dbName)) {
                    fieldsImport(dbName, tbName);
                }
            }
            // Report success only after every database has been imported.
            return true;
        } catch (MetaException me) {
            me.printStackTrace();
        } catch (RepositoryException re) {
            re.printStackTrace();
        }
        return false;
    }
public static boolean databasesImport() throws MetaException, RepositoryException {
ClassType classType = null;
try {
classType = TypeSystem.getInstance()
.getDataType(ClassType.class, HiveStructureBridge.DB_CLASS_TYPE);
} catch (MetadataException e1) {
e1.printStackTrace();
}
for (String dbName : msc.getAllDatabases()) {
databaseImport(dbName);
}
return true;
}
public static boolean databaseImport(String dbName) throws MetaException, RepositoryException {
try {
Database db = msc.getDatabase(dbName);
Referenceable dbRef = new Referenceable(HiveStructureBridge.DB_CLASS_TYPE);
dbRef.set("DESC", db.getDescription());
dbRef.set("DB_LOCATION_URI", db.getLocationUri());
dbRef.set("NAME", db.getName());
if (db.isSetOwnerType()) {
dbRef.set("OWNER_TYPE", db.getOwnerType());
}
if (db.isSetOwnerName()) {
dbRef.set("OWNER_NAME", db.getOwnerName());
}
repo.create(dbRef);
} catch (NoSuchObjectException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return true;
}
public static boolean tablesImport(String dbName) throws MetaException, RepositoryException {
ClassType classType = null;
try {
classType = TypeSystem.getInstance()
.getDataType(ClassType.class, HiveStructureBridge.TB_CLASS_TYPE);
} catch (MetadataException e1) {
e1.printStackTrace();
}
for (String tbName : msc.getAllTables(dbName)) {
tableImport(dbName, tbName);
}
return true;
}
public static boolean tableImport(String dbName, String tbName)
throws MetaException, RepositoryException {
try {
Table tb = msc.getTable(dbName, tbName);
Referenceable tbRef = new Referenceable(HiveStructureBridge.TB_CLASS_TYPE);
tbRef.set("CREATE_TIME", tb.getCreateTime());
tbRef.set("LAST_ACCESS_TIME", tb.getLastAccessTime());
tbRef.set("OWNER", tb.getOwner());
tbRef.set("TBL_NAME", tb.getTableName());
tbRef.set("TBL_TYPE", tb.getTableType());
if (tb.isSetViewExpandedText()) {
tbRef.set("VIEW_EXPANDED_TEXT", tb.getViewExpandedText());
}
if (tb.isSetViewOriginalText()) {
tbRef.set("VIEW_ORIGINAL_TEXT", tb.getViewOriginalText());
}
repo.create(tbRef);
} catch (NoSuchObjectException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return true;
}
public static boolean fieldsImport(String dbName, String tbName)
throws MetaException, RepositoryException {
ClassType classType = null;
try {
classType = TypeSystem.getInstance()
.getDataType(ClassType.class, HiveStructureBridge.FD_CLASS_TYPE);
} catch (MetadataException e1) {
e1.printStackTrace();
}
try {
for (FieldSchema fs : msc.getFields(dbName, tbName)) {
Referenceable fdRef = new Referenceable(HiveStructureBridge.FD_CLASS_TYPE);
                if (fs.isSetComment()) {
                    fdRef.set("COMMENT", fs.getComment());
                }
fdRef.set("COLUMN_NAME", fs.getName());
fdRef.set("TYPE_NAME", fs.getType());
repo.create(fdRef);
}
} catch (UnknownTableException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (UnknownDBException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return true;
}
public static boolean fieldImport(String dbName, String tbName, String fdName)
throws MetaException {
try {
for (FieldSchema fs : msc.getFields(dbName, tbName)) {
                if (fs.getName().equals(fdName)) {
                    Referenceable fdRef = new Referenceable(HiveStructureBridge.FD_CLASS_TYPE);
                    if (fs.isSetComment()) {
                        fdRef.set("COMMENT", fs.getComment());
                    }
fdRef.set("COLUMN_NAME", fs.getName());
fdRef.set("TYPE_NAME", fs.getType());
//SaveObject to MS Backend
return true;
}
}
} catch (UnknownTableException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (UnknownDBException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return true;
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.hivestructure;
import org.apache.atlas.MetadataException;
import org.apache.atlas.bridge.ABridge;
import org.apache.atlas.repository.MetadataRepository;
import org.apache.atlas.typesystem.types.AttributeDefinition;
import org.apache.atlas.typesystem.types.ClassType;
import org.apache.atlas.typesystem.types.HierarchicalTypeDefinition;
import org.apache.atlas.typesystem.types.Multiplicity;
import org.apache.atlas.typesystem.types.TypeSystem;
import javax.inject.Inject;
import java.util.ArrayList;
public class HiveStructureBridge extends ABridge {
static final String DB_CLASS_TYPE = "HiveDatabase";
static final String TB_CLASS_TYPE = "HiveTable";
static final String FD_CLASS_TYPE = "HiveField";
@Inject
protected HiveStructureBridge(MetadataRepository repo) {
super(repo);
// TODO Auto-generated constructor stub
}
public boolean defineBridgeTypes(TypeSystem ts) {
ArrayList<HierarchicalTypeDefinition<?>> al
= new ArrayList<HierarchicalTypeDefinition<?>>();
// TODO
//convert to helper methods
// Add to arrayList
try {
HierarchicalTypeDefinition<ClassType> databaseClassTypeDef
= new HierarchicalTypeDefinition<ClassType>("ClassType", DB_CLASS_TYPE, null,
new AttributeDefinition[]{
new AttributeDefinition("DESC", "STRING_TYPE", Multiplicity.OPTIONAL,
false, null),
new AttributeDefinition("DB_LOCATION_URI", "STRING_TYPE",
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("NAME", "STRING_TYPE", Multiplicity.REQUIRED,
false, null),
new AttributeDefinition("OWNER_TYPE", "STRING_TYPE",
Multiplicity.OPTIONAL, false, null),
new AttributeDefinition("OWNER_NAME", "STRING_TYPE",
Multiplicity.OPTIONAL, false, null)
}
            );
            al.add(databaseClassTypeDef);
HierarchicalTypeDefinition<ClassType> tableClassTypeDef
= new HierarchicalTypeDefinition<ClassType>("ClassType", TB_CLASS_TYPE, null,
new AttributeDefinition[]{
new AttributeDefinition("CREATE_TIME", "LONG_TYPE",
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("LAST_ACCESS_TIME", "LONG_TYPE",
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("OWNER", "STRING_TYPE", Multiplicity.REQUIRED,
false, null),
new AttributeDefinition("TBL_NAME", "STRING_TYPE",
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("TBL_TYPE", "STRING_TYPE",
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("VIEW_EXPANDED_TEXT", "STRING_TYPE",
Multiplicity.OPTIONAL, false, null),
new AttributeDefinition("VIEW_ORIGINAL_TEXT", "STRING_TYPE",
Multiplicity.OPTIONAL, false, null)
}
            );
            al.add(tableClassTypeDef);
HierarchicalTypeDefinition<ClassType> columnClassTypeDef
= new HierarchicalTypeDefinition<ClassType>("ClassType", FD_CLASS_TYPE, null,
new AttributeDefinition[]{
new AttributeDefinition("COMMENT", "STRING_TYPE", Multiplicity.OPTIONAL,
false, null),
new AttributeDefinition("COLUMN_NAME", "STRING_TYPE",
Multiplicity.REQUIRED, false, null),
new AttributeDefinition("TYPE_NAME", "STRING_TYPE",
Multiplicity.REQUIRED, false, null)
}
            );
            al.add(columnClassTypeDef);
} catch (ClassNotFoundException e) {
e.printStackTrace();
}
for (HierarchicalTypeDefinition htd : al) {
try {
ts.defineClassType(htd);
} catch (MetadataException e) {
                System.out.println(
                        htd.hierarchicalMetaTypeName + " could not be added to the type system");
e.printStackTrace();
}
}
return false;
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.module;
import com.google.inject.AbstractModule;
import com.google.inject.Scopes;
import com.google.inject.multibindings.MapBinder;
import org.apache.atlas.RepositoryMetadataModule;
import org.apache.atlas.bridge.BridgeTypeBootstrapper;
import org.apache.atlas.bridge.IBridge;
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.List;
public class BridgeModule extends AbstractModule {
public static final Logger LOG = LoggerFactory
.getLogger(BridgeModule.class);
@Override
protected void configure() {
install(new RepositoryMetadataModule());
// make sure the BridgeTypeBootstrapper is only ever created once
bind(BridgeTypeBootstrapper.class).in(Scopes.SINGLETON);
// Load the configured bridge classes and add them to the map binder
MapBinder<Class, IBridge> mapbinder = MapBinder.newMapBinder(binder(),
Class.class, IBridge.class);
String propsURI = System.getProperty("bridgeManager.propsFile",
"bridge-manager.properties");
List<Class<? extends IBridge>> bridges = getBridgeClasses(propsURI);
for (Class<? extends IBridge> bridgeClass : bridges) {
mapbinder.addBinding(bridgeClass).to(bridgeClass).in(Scopes.SINGLETON);
}
}
/*
* Get the bridge classes from the configuration file
*/
private List<Class<? extends IBridge>> getBridgeClasses(
String bridgePropFileName) {
List<Class<? extends IBridge>> aBList = new ArrayList<Class<? extends IBridge>>();
PropertiesConfiguration config = new PropertiesConfiguration();
try {
LOG.info("Loading : Active Bridge List");
config.load(bridgePropFileName);
String[] activeBridgeList = ((String) config
.getProperty("BridgeManager.activeBridges")).split(",");
LOG.info("Loaded : Active Bridge List");
for (String s : activeBridgeList) {
Class<? extends IBridge> bridgeCls = (Class<? extends IBridge>) Class
.forName(s);
aBList.add(bridgeCls);
}
} catch (ConfigurationException | IllegalArgumentException
| SecurityException | ClassNotFoundException e) {
LOG.error(e.getMessage(), e);
e.printStackTrace();
}
return aBList;
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.web.resources;
import org.apache.atlas.bridge.hivelineage.HiveLineageBridge;
import javax.inject.Singleton;
//@Path("bridge/hive")
@Singleton
public class HiveLineageResource {
private final HiveLineageBridge bridge = null;
/*
//@Inject
public HiveLineageResource(HiveLineageBridge bridge) {
this.bridge = bridge;
}
//@Inject
public HiveLineageResource(Map<Class<? extends IBridge>, IBridge> bridges) {
this.bridge = (HiveLineageBridge) bridges.get(HiveLineageBridge.class);
}
@GET
@Path("/{id}")
@Produces(MediaType.APPLICATION_JSON)
public JsonElement getById(@PathParam("id") String id) throws RepositoryException {
// get the lineage bean
HiveLineage hlb = (HiveLineage) bridge.get(id);
// turn it into a JsonTree & return
return new Gson().toJsonTree(hlb);
}
@GET
@Produces(MediaType.APPLICATION_JSON)
public JsonElement list() throws RepositoryException {
// make a new JsonArray to be returned
JsonArray ja = new JsonArray();
// iterate over each item returned by the hive bridge's list() method
for (String s: bridge.list()) {
// they are GUIDs so make them into JsonPrimitives
ja.add(new JsonPrimitive(s));
}
return ja;
}
@POST
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public JsonElement addLineage(@Context HttpServletRequest request)
throws IOException, MetadataException {
// create a reader
try (Reader reader = new InputStreamReader(request.getInputStream())) {
// deserialize
HiveLineage bean = new Gson().fromJson(reader, HiveLineage.class);
String id = bridge.create(bean);
JsonObject jo = new JsonObject();
jo.addProperty("id", id);
return jo;
}
}
*/
}
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
#BridgeManager.activeBridges denotes which bridge definitions to load from the classpath (comma-separated list of fully qualified class names)
#
BridgeManager.activeBridges=org.apache.atlas.bridge.hivelineage.HiveLineageBridge
\ No newline at end of file
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
import org.apache.atlas.RepositoryMetadataModule;
import org.apache.atlas.repository.MetadataRepository;
import org.testng.Assert;
import org.testng.annotations.Guice;
import org.testng.annotations.Test;
import javax.inject.Inject;
@Guice(modules = RepositoryMetadataModule.class)
public class BridgeManagerTest {
@Inject
MetadataRepository repo;
@Test(enabled = false)
public void testLoadPropertiesFile() throws Exception {
BridgeManager bm = new BridgeManager(repo);
System.out.println(bm.getActiveBridges().size());
Assert.assertEquals(bm.activeBridges.get(0).getClass().getSimpleName(),
"HiveLineageBridge");
}
@Test
public void testBeanConversion() {
//Tests conversion of a bean to a type
}
@Test
public void testIRefConversion() {
//Tests conversion of an IRef cast to a bean
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
import org.apache.atlas.bridge.module.BridgeModule;
import org.testng.Assert;
import org.testng.annotations.Guice;
import org.testng.annotations.Test;
@Guice(modules = {BridgeModule.class})
public class TestBridgeModule {
@Test
public void loadAnything() {
// if it makes it here, the BridgeModule loaded successfully
Assert.assertTrue(true);
}
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge;
public class TestGenericBridges {
//TODO Build Generic Tests for non-lineage Bridge
}
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.hivelineage;
import com.google.gson.Gson;
import org.apache.atlas.MetadataException;
import org.apache.atlas.bridge.BridgeTypeBootstrapper;
import org.apache.atlas.bridge.hivelineage.hook.HiveLineage;
import org.apache.atlas.bridge.module.BridgeModule;
import org.apache.atlas.repository.RepositoryException;
import org.apache.commons.collections.IteratorUtils;
import org.testng.Assert;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Guice;
import org.testng.annotations.Test;
import javax.inject.Inject;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.List;
@Guice(modules = {BridgeModule.class})
public class TestHiveLineageBridge {
@Inject
HiveLineageBridge bridge;
@Inject
BridgeTypeBootstrapper bootstrapper;
HiveLineage hlb;
// the id of one.json in the repo (test #1)
String oneId;
private HiveLineage loadHiveLineageBean(String path) throws IOException {
return new Gson().fromJson(new InputStreamReader(this.getClass().getResourceAsStream(path)),
HiveLineage.class);
}
@BeforeClass
public void bootstrap() throws IOException, MetadataException {
bootstrapper.bootstrap();
hlb = loadHiveLineageBean("/one.json");
}
@Test(priority = 1, enabled = false)
public void testCreate() throws MetadataException {
// add the lineage bean to the repo
oneId = bridge.create(hlb);
// make sure this actually worked
Assert.assertNotNull(oneId);
}
@Test(priority = 2, enabled = false)
public void testGet() throws RepositoryException, IOException {
Object bean = bridge.get(oneId);
Assert.assertEquals(hlb, bean);
}
@Test(priority = 3, enabled = false)
public void testList() throws RepositoryException {
List<String> list = IteratorUtils.toList(bridge.list().iterator());
Assert.assertEquals(list.size(), 1);
Assert.assertEquals(list.get(0), oneId);
}
}
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
#BridgeManager.activeBridges denotes which bridge definitions to load from the classpath (comma-separated list of fully qualified class names)
#
BridgeManager.activeBridges=org.apache.atlas.bridge.hivelineage.HiveLineageBridge
\ No newline at end of file
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
storage.backend=inmemory
# Graph Search Index
index.search.backend=elasticsearch
index.search.directory=target/data/es
index.search.elasticsearch.client-only=false
index.search.elasticsearch.local-mode=true
\ No newline at end of file
{
"queryId": "a760104_20150106120303_036186d5-a991-4dfc-9ff2-05b072c7e711",
"hiveId": "90797386-3933-4ab0-ae68-a7baa7e155d4",
"user": "",
"queryStartTime": "1420563838114",
"queryEndTime": "1420563853806",
"query": "create table nyse_gss_count_dump as select count(nyse.stock_symbol) stock_symbol_count, stock_symbol from nyse_stocks nyse where (nyse.stock_symbol \u003d \u0027AET\u0027 or nyse.stock_symbol \u003d \u0027UNH\u0027 ) and nyse.stock_symbol \u003d \u0027T\u0027 GROUP by stock_symbol",
"tableName": "nyse_gss_count_dump",
"success": true,
"failed": false,
"executionEngine": "tez",
"sourceTables": [
{
"tableName": "nyse_stocks",
"tableAlias": "nyse"
}
],
"queryColumns": [
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnAlias": "stock_symbol_count",
"columnFunction": "count"
},
{"columnName": "stock_symbol"}
],
"whereClause": [
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnOperator": "\u003d",
"columnValue": "\u0027AET\u0027"
},
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnOperator": "\u003d",
"columnValue": "\u0027UNH\u0027"
},
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnOperator": "\u003d",
"columnValue": "\u0027T\u0027"
}
],
"groupBy": [{"columnName": "stock_symbol"}]
}
\ No newline at end of file
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
#BridgeManager.activeBridges denotes which bridge definitions to load from the classpath (comma-separated list of fully qualified class names)
#
BridgeManager.activeBridges=org.apache.atlas.bridge.HiveLineage
\ No newline at end of file
{
"queryId": "a760104_20150108124747_53cb7716-8756-4dfe-b746-4055f53e2895",
"hiveId": "1aebd95c-c7d5-4893-8c8c-c9ae098bdd5c",
"user": "",
"queryStartTime": "1420739257453",
"queryEndTime": "1420739277589",
"query": "create table nyse_gss_count_dump as select count(nyse.stock_symbol) stock_symbol_count, stock_symbol from nyse_stocks nyse where (nyse.stock_symbol \u003d \u0027AET\u0027 or nyse.stock_symbol \u003d \u0027UNH\u0027 ) and nyse.stock_symbol \u003d \u0027T\u0027 GROUP by stock_symbol",
"tableName": "nyse_gss_count_dump",
"success": true,
"failed": false,
"executionEngine": "tez",
"sourceTables": [
{
"tableName": "nyse_stocks",
"tableAlias": "nyse"
}
],
"queryColumns": [
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnAlias": "stock_symbol_count",
"columnFunction": "count"
},
{"columnName": "stock_symbol"}
],
"whereClause": [
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnOperator": "\u003d",
"columnValue": "\u0027AET\u0027"
},
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnOperator": "\u003d",
"columnValue": "\u0027UNH\u0027"
},
{
"tbAliasOrName": "nyse",
"columnName": "stock_symbol",
"columnOperator": "\u003d",
"columnValue": "\u0027T\u0027"
}
],
"groupBy": [{"columnName": "stock_symbol"}]
}
\ No newline at end of file
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.atlas</groupId>
<artifactId>atlas-bridge-parent</artifactId>
<version>0.1-incubating-SNAPSHOT</version>
</parent>
<artifactId>atlas-bridge-hive</artifactId>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-common</artifactId>
<version>0.13.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.4.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>0.13.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
</plugin>
<!--
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy</id>
<phase>prepare-package</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>org.apache.atlas</groupId>
<artifactId>atlas-common</artifactId>
<version>0.1-incubating-SNAPSHOT</version>
<outputDirectory>${project.build.directory}</outputDirectory>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
-->
</plugins>
</build>
</project>
\ No newline at end of file
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
* <p/>
* http://www.apache.org/licenses/LICENSE-2.0
* <p/>
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.atlas.bridge.hivelineage.hook;
import com.google.gson.Gson;
import org.apache.hadoop.hive.ql.parse.ParseException;
import org.apache.hadoop.hive.ql.parse.SemanticException;
public class HiveLineageInfoTest {
public static String parseQuery(String query) throws SemanticException,
ParseException {
HiveLineageInfo lep = new HiveLineageInfo();
lep.getLineageInfo(query);
Gson gson = new Gson();
String jsonOut = gson.toJson(lep.getHLBean());
return jsonOut;
}
}
......@@ -86,7 +86,8 @@ public class AtlasClient {
try {
clientConfig = getClientProperties();
if (clientConfig.getBoolean(TLS_ENABLED, false)) {
// create an SSL properties configuration if one doesn't exist. SSLFactory expects a file, so forced to create a
// create an SSL properties configuration if one doesn't exist. SSLFactory expects a file, so forced
// to create a
// configuration object, persist it, then subsequently pass in an empty configuration to SSLFactory
SecureClientUtils.persistSSLClientConfiguration(clientConfig);
}
......@@ -246,12 +247,12 @@ public class AtlasClient {
* @return result json object
* @throws AtlasServiceException
*/
public JSONArray rawSearch(String typeName, String attributeName, Object attributeValue) throws
AtlasServiceException {
// String gremlinQuery = String.format(
// "g.V.has(\"typeName\",\"%s\").and(_().has(\"%s.%s\", T.eq, \"%s\")).toList()",
// typeName, typeName, attributeName, attributeValue);
// return searchByGremlin(gremlinQuery);
public JSONArray rawSearch(String typeName, String attributeName, Object attributeValue)
throws AtlasServiceException {
// String gremlinQuery = String.format(
// "g.V.has(\"typeName\",\"%s\").and(_().has(\"%s.%s\", T.eq, \"%s\")).toList()",
// typeName, typeName, attributeName, attributeValue);
// return searchByGremlin(gremlinQuery);
String dslQuery = String.format("%s where %s = \"%s\"", typeName, attributeName, attributeValue);
return searchByDSL(dslQuery);
}
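The reworked `rawSearch` above now delegates to `searchByDSL` with a query of the form `typeName where attr = "value"`. A small sketch of just that string construction (the helper name is illustrative, not the client API):

```java
// Illustrative sketch of the DSL query string built by rawSearch above.
// Note: naive quoting; embedded double quotes in the value are not escaped.
public class DslQuerySketch {
    public static String rawSearchQuery(String typeName, String attributeName, Object attributeValue) {
        return String.format("%s where %s = \"%s\"", typeName, attributeName, attributeValue);
    }
}
```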
......@@ -341,13 +342,11 @@ public class AtlasClient {
private JSONObject callAPIWithResource(API api, WebResource resource, Object requestObject)
throws AtlasServiceException {
ClientResponse clientResponse = resource
.accept(JSON_MEDIA_TYPE)
.type(JSON_MEDIA_TYPE)
ClientResponse clientResponse = resource.accept(JSON_MEDIA_TYPE).type(JSON_MEDIA_TYPE)
.method(api.getMethod(), ClientResponse.class, requestObject);
Response.Status expectedStatus = HttpMethod.POST.equals(api.getMethod())
? Response.Status.CREATED : Response.Status.OK;
Response.Status expectedStatus =
HttpMethod.POST.equals(api.getMethod()) ? Response.Status.CREATED : Response.Status.OK;
if (clientResponse.getStatus() == expectedStatus.getStatusCode()) {
String responseAsString = clientResponse.getEntity(String.class);
try {
......@@ -360,8 +359,7 @@ public class AtlasClient {
throw new AtlasServiceException(api, clientResponse);
}
private JSONObject callAPI(API api, Object requestObject,
String... pathParams) throws AtlasServiceException {
private JSONObject callAPI(API api, Object requestObject, String... pathParams) throws AtlasServiceException {
WebResource resource = getResource(api, pathParams);
return callAPIWithResource(api, resource, requestObject);
}
......
......@@ -62,9 +62,7 @@ public class SecureClientUtils {
public static URLConnectionClientHandler getClientConnectionHandler(DefaultClientConfig config,
PropertiesConfiguration clientConfig) {
config.getProperties().put(
URLConnectionClientHandler.PROPERTY_HTTP_URL_CONNECTION_SET_METHOD_WORKAROUND,
true);
config.getProperties().put(URLConnectionClientHandler.PROPERTY_HTTP_URL_CONNECTION_SET_METHOD_WORKAROUND, true);
Configuration conf = new Configuration(false);
conf.addResource(conf.get(SSLFactory.SSL_CLIENT_CONF_KEY, "ssl-client.xml"));
String authType = "simple";
......@@ -95,11 +93,9 @@ public class SecureClientUtils {
return new URLConnectionClientHandler(httpURLConnectionFactory);
}
private final static ConnectionConfigurator DEFAULT_TIMEOUT_CONN_CONFIGURATOR =
new ConnectionConfigurator() {
private final static ConnectionConfigurator DEFAULT_TIMEOUT_CONN_CONFIGURATOR = new ConnectionConfigurator() {
@Override
public HttpURLConnection configure(HttpURLConnection conn)
throws IOException {
public HttpURLConnection configure(HttpURLConnection conn) throws IOException {
setTimeouts(conn, DEFAULT_SOCKET_TIMEOUT);
return conn;
}
......@@ -109,14 +105,13 @@ public class SecureClientUtils {
try {
return newSslConnConfigurator(DEFAULT_SOCKET_TIMEOUT, conf);
} catch (Exception e) {
LOG.debug("Cannot load customized ssl related configuration. " +
"Fallback to system-generic settings.", e);
LOG.debug("Cannot load customized ssl related configuration. " + "Fallback to system-generic settings.", e);
return DEFAULT_TIMEOUT_CONN_CONFIGURATOR;
}
}
private static ConnectionConfigurator newSslConnConfigurator(final int timeout,
Configuration conf) throws IOException, GeneralSecurityException {
private static ConnectionConfigurator newSslConnConfigurator(final int timeout, Configuration conf)
throws IOException, GeneralSecurityException {
final SSLFactory factory;
final SSLSocketFactory sf;
final HostnameVerifier hv;
......@@ -128,8 +123,7 @@ public class SecureClientUtils {
return new ConnectionConfigurator() {
@Override
public HttpURLConnection configure(HttpURLConnection conn)
throws IOException {
public HttpURLConnection configure(HttpURLConnection conn) throws IOException {
if (conn instanceof HttpsURLConnection) {
HttpsURLConnection c = (HttpsURLConnection) conn;
c.setSSLSocketFactory(sf);
......@@ -168,7 +162,8 @@ public class SecureClientUtils {
return new File(sslDir, SecurityProperties.SSL_CLIENT_PROPERTIES);
}
public static void persistSSLClientConfiguration(PropertiesConfiguration clientConfig) throws AtlasException, IOException {
public static void persistSSLClientConfiguration(PropertiesConfiguration clientConfig)
throws AtlasException, IOException {
//trust settings
Configuration configuration = new Configuration(false);
File sslClientFile = getSSLClientFile();
......
......@@ -36,18 +36,10 @@ import java.util.Properties;
*
*/
public class BaseSecurityTest {
private static final String JAAS_ENTRY =
"%s { \n"
+ " %s required\n"
private static final String JAAS_ENTRY = "%s { \n" + " %s required\n"
// kerberos module
+ " keyTab=\"%s\"\n"
+ " debug=true\n"
+ " principal=\"%s\"\n"
+ " useKeyTab=true\n"
+ " useTicketCache=false\n"
+ " doNotPrompt=true\n"
+ " storeKey=true;\n"
+ "}; \n";
+ " keyTab=\"%s\"\n" + " debug=true\n" + " principal=\"%s\"\n" + " useKeyTab=true\n"
+ " useTicketCache=false\n" + " doNotPrompt=true\n" + " storeKey=true;\n" + "}; \n";
protected MiniKdc kdc;
protected String getWarPath() {
......@@ -56,8 +48,8 @@ public class BaseSecurityTest {
}
protected void generateTestProperties(Properties props) throws ConfigurationException, IOException {
PropertiesConfiguration config = new PropertiesConfiguration(System.getProperty("user.dir") +
"/../src/conf/application.properties");
PropertiesConfiguration config =
new PropertiesConfiguration(System.getProperty("user.dir") + "/../src/conf/application.properties");
for (String propName : props.stringPropertyNames()) {
config.setProperty(propName, props.getProperty(propName));
}
......@@ -88,20 +80,11 @@ public class BaseSecurityTest {
return kdcWorkDir;
}
public String createJAASEntry(
String context,
String principal,
File keytab) {
public String createJAASEntry(String context, String principal, File keytab) {
String keytabpath = keytab.getAbsolutePath();
// fix up for windows; no-op on unix
keytabpath = keytabpath.replace('\\', '/');
return String.format(
Locale.ENGLISH,
JAAS_ENTRY,
context,
getKerberosAuthModuleForJVM(),
keytabpath,
principal);
return String.format(Locale.ENGLISH, JAAS_ENTRY, context, getKerberosAuthModuleForJVM(), keytabpath, principal);
}
protected String getKerberosAuthModuleForJVM() {
......@@ -119,10 +102,7 @@ public class BaseSecurityTest {
protected File createKeytab(MiniKdc kdc, File kdcWorkDir, String principal, String filename) throws Exception {
File keytab = new File(kdcWorkDir, filename);
kdc.createPrincipal(keytab,
principal,
principal + "/localhost",
principal + "/127.0.0.1");
kdc.createPrincipal(keytab, principal, principal + "/localhost", principal + "/127.0.0.1");
return keytab;
}
}
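The collapsed `JAAS_ENTRY` template above is filled in by `createJAASEntry`, including the Windows backslash fix-up. A self-contained sketch of the same formatting, with illustrative context, module, and principal values (not the test class's API):

```java
// Illustrative sketch of the JAAS entry formatting performed by
// createJAASEntry above; all argument values below are hypothetical.
import java.util.Locale;

public class JaasEntrySketch {
    private static final String JAAS_ENTRY = "%s { \n"
            + " %s required\n"
            // kerberos module options
            + " keyTab=\"%s\"\n"
            + " debug=true\n"
            + " principal=\"%s\"\n"
            + " useKeyTab=true\n"
            + " useTicketCache=false\n"
            + " doNotPrompt=true\n"
            + " storeKey=true;\n"
            + "}; \n";

    public static String entry(String context, String module, String keytabPath, String principal) {
        // fix up for windows paths; no-op on unix
        String path = keytabPath.replace('\\', '/');
        return String.format(Locale.ENGLISH, JAAS_ENTRY, context, module, path, principal);
    }
}
```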
<!DOCTYPE html>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<html ng-app="DgcApp" lang="en">
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="description" content="">
<meta name="author" content="">
<title ng-bind="windowTitle">search</title>
<!-- Bootstrap Core CSS -->
<link href="css/bootstrap.min.css" rel="stylesheet">
<!-- Custom CSS -->
<link href="css/heroic-features.css" rel="stylesheet">
<link href="css/sticky-footer-navbar.css" rel="stylesheet">
<link href="css/style.css" rel="stylesheet">
<!-- HTML5 Shim and Respond.js IE8 support of HTML5 elements and media queries -->
<!-- WARNING: Respond.js doesn't work if you view the page via file:// -->
<!--[if lt IE 9]>
<script src="https://oss.maxcdn.com/libs/html5shiv/3.7.0/html5shiv.js"></script>
<script src="https://oss.maxcdn.com/libs/respond.js/1.4.2/respond.min.js"></script>
<![endif]-->
<script src="lib/Angular/angular.min.js" type="text/javascript" ></script>
<script src="lib/Angular/angular-route.min.js" type="text/javascript" ></script>
<script src="http://angular-ui.github.io/bootstrap/ui-bootstrap-tpls-0.6.0.js" type="text/javascript"></script>
<script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
<script src="http://labratrevenge.com/d3-tip/javascripts/d3.tip.v0.6.3.js"></script>
<script src="js/app.js" type="text/javascript" ></script>
<script src="js/controllers.js" type="text/javascript" ></script>
<style type="text/css">
.node {
font: 15px sans-serif;
opacity: 1;
stroke-width: 0.5px;
}
.node a{
text-decoration:none;
}
.link {
stroke: #23A410;
stroke-opacity: 1;
}
#suit{
stroke: #23A410;
background: #23A410;
}
path.link {
fill: none;
stroke: #23A410;
stroke-width: 4px;
cursor: default;
}
svg:not(.active):not(.ctrl) path.link {
cursor: pointer;
}
path.link.selected {
stroke-dasharray: 10,2;
}
path.link.dragline {
pointer-events: none;
}
path.link.hidden {
stroke-width: 0;
}
.navbar-bottom{
background-color:#fafafa;
border-top:solid 4px #5cbb5a;
position:absolute;
width:100%;
height:60px;
}
</style>
</head>
<body>
<!-- Navigation -->
<nav class="navbar navbar-default navbar-fixed-top bg-top">
<div class="container">
<a href="index.html"> <h4 style="color:#333333; padding-left:15px; padding-top:5px"> <img src="img/logo-sm.png"></h4> </a>
</div>
</nav>
<div class="main" ng-view>
</div>
<footer class="navbar-bottom">
<div class="container">
<p class="txt2" align="right">Powered by<img src="img/HWLOGO-g.png"></p>
</div>
</footer>
</body>
</html>
/*!
* Start Bootstrap - Heroic Features HTML Template (http://startbootstrap.com)
* Code licensed under the Apache License v2.0.
* For details, see http://www.apache.org/licenses/LICENSE-2.0.
*/
body {
padding-top: 70px; /* Required padding for .navbar-fixed-top. Remove if using .navbar-static-top. Change if height of navigation changes. */
}
.hero-spacer {
margin-top: 50px;
}
.hero-feature {
margin-bottom: 30px;
}
footer {
margin: 50px 0;
}
\ No newline at end of file
/* Sticky footer styles
-------------------------------------------------- */
html {
position: relative;
min-height: 100%;
}
body {
/* Margin bottom by footer height */
margin-bottom: 60px;
}
.footer {
position: absolute;
bottom: 0;
width: 100%;
/* Set the fixed height of the footer here */
height: 60px;
background-color: #f5f5f5;
}
/* Custom page CSS
-------------------------------------------------- */
/* Not required for template or sticky footer method. */
body > .container {
padding: 60px 15px 0;
}
.container .text-muted {
margin: 20px 0;
}
.footer > .container {
padding-right: 15px;
padding-left: 15px;
}
code {
font-size: 80%;
}
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
.bg-top2{
background: #eeeded; /* for non-css3 browsers */
filter: progid:DXImageTransform.Microsoft.gradient(startColorstr='#eeeded', endColorstr='#e5e5e4'); /* for IE */
background: -webkit-gradient(linear, left top, left bottom, from(#eeeded), to(#e5e5e4)); /* for webkit browsers */
background: -moz-linear-gradient(top, #eeeded, #e5e5e4); /* for firefox 3.6+ */
width:100%;
border-bottom:solid 1px #d9d9d8;
margin-bottom:20px;
}
.bg-top{
background: #fafafa; /* for non-css3 browsers */
border-bottom:solid 4px #5cbb5a;
}
.bg-bottom{
background: #fafafa; /* for non-css3 browsers */
border-top:solid 4px #5cbb5a;
}
.txt2{
color:#333333;
margin:10px;
}
.table-border{
border:solid 2px #d9d9d8;
margin-left:30px;
background-color:#edecec;
}
.search_table{
border-top:solid 2px #5cbb5a;
border-left:solid 2px #5cbb5a;
border-bottom:solid 2px #5cbb5a;
background-color:#f4f4f4;
color:#333333;
}
.serach_btn{
border:solid 1px #686867;
}
.txt1{
color:#5cbb5a;
font-size:23px;
}
.hrbg{
background-color:#686867;
border:solid 1px #686867;
}
/* unvisited link */
.searchbtn a:link {
background-color: #bdbdbd;
}
/* visited link */
.searchbtn a:visited {
background-color: #bdbdbd;
}
/* mouse over link */
.searchbtn a:hover {
background-color: #6abd45;
}
/* selected link */
.searchbtn a:active {
background-color: #bdbdbd;
}
<!DOCTYPE html>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<html lang="en">
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="description" content="">
<meta name="author" content="">
<link rel="icon" href="">
<title>DGI | aetna</title>
<!-- Bootstrap core CSS -->
<link href="css/bootstrap.min.css" rel="stylesheet">
<!-- Custom styles for this template -->
<link href="css/sticky-footer-navbar.css" rel="stylesheet">
<link href="css/style.css" rel="stylesheet">
<!-- HTML5 shim and Respond.js for IE8 support of HTML5 elements and media queries -->
<!--[if lt IE 9]>
<script src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js"></script>
<script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
<![endif]-->
</head>
<body>
<!-- Fixed navbar -->
<nav class="navbar navbar-default navbar-fixed-top bg-top">
<div class="container">
<div class="row">
<a href="index.html"> <h4 style="color:#333333; padding-left:15px; padding-top:5px"> <img src="img/logo-sm.png"></h4> </a>
</div>
</div>
</nav>
<!-- Begin page content -->
<a href="Search.html">
<div style="margin-top:40px; width:100%" align="center">
<img src="img/splash2.png" class="img-responsive">
</div>
</a>
<!--footer-->
<footer class="footer bg-bottom">
<div class="container">
<p class="txt2" align="right">Powered by <img src="img/HWLOGO-g.png"></p>
</div>
</footer>
<!-- Bootstrap core JavaScript
================================================== -->
<!-- Placed at the end of the document so the pages load faster -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.2/jquery.min.js"></script>
<script src="js/bootstrap.min.js"></script>
</body>
</html>
<h4>{{key}}:</h4>
<p>{{val}}</p>
// This file is autogenerated via the `commonjs` Grunt task. You can require() this file in a CommonJS environment.
require('../../js/transition.js')
require('../../js/alert.js')
require('../../js/button.js')
require('../../js/carousel.js')
require('../../js/collapse.js')
require('../../js/dropdown.js')
require('../../js/modal.js')
require('../../js/tooltip.js')
require('../../js/popover.js')
require('../../js/scrollspy.js')
require('../../js/tab.js')
require('../../js/affix.js')
\ No newline at end of file
/*
AngularJS v1.2.18
(c) 2010-2014 Google, Inc. http://angularjs.org
License: MIT
*/
(function(n,e,A){'use strict';function x(s,g,k){return{restrict:"ECA",terminal:!0,priority:400,transclude:"element",link:function(a,c,b,f,w){function y(){p&&(p.remove(),p=null);h&&(h.$destroy(),h=null);l&&(k.leave(l,function(){p=null}),p=l,l=null)}function v(){var b=s.current&&s.current.locals;if(e.isDefined(b&&b.$template)){var b=a.$new(),d=s.current;l=w(b,function(d){k.enter(d,null,l||c,function(){!e.isDefined(t)||t&&!a.$eval(t)||g()});y()});h=d.scope=b;h.$emit("$viewContentLoaded");h.$eval(u)}else y()}
var h,l,p,t=b.autoscroll,u=b.onload||"";a.$on("$routeChangeSuccess",v);v()}}}function z(e,g,k){return{restrict:"ECA",priority:-400,link:function(a,c){var b=k.current,f=b.locals;c.html(f.$template);var w=e(c.contents());b.controller&&(f.$scope=a,f=g(b.controller,f),b.controllerAs&&(a[b.controllerAs]=f),c.data("$ngControllerController",f),c.children().data("$ngControllerController",f));w(a)}}}n=e.module("ngRoute",["ng"]).provider("$route",function(){function s(a,c){return e.extend(new (e.extend(function(){},
{prototype:a})),c)}function g(a,e){var b=e.caseInsensitiveMatch,f={originalPath:a,regexp:a},k=f.keys=[];a=a.replace(/([().])/g,"\\$1").replace(/(\/)?:(\w+)([\?\*])?/g,function(a,e,b,c){a="?"===c?c:null;c="*"===c?c:null;k.push({name:b,optional:!!a});e=e||"";return""+(a?"":e)+"(?:"+(a?e:"")+(c&&"(.+?)"||"([^/]+)")+(a||"")+")"+(a||"")}).replace(/([\/$\*])/g,"\\$1");f.regexp=RegExp("^"+a+"$",b?"i":"");return f}var k={};this.when=function(a,c){k[a]=e.extend({reloadOnSearch:!0},c,a&&g(a,c));if(a){var b=
"/"==a[a.length-1]?a.substr(0,a.length-1):a+"/";k[b]=e.extend({redirectTo:a},g(b,c))}return this};this.otherwise=function(a){this.when(null,a);return this};this.$get=["$rootScope","$location","$routeParams","$q","$injector","$http","$templateCache","$sce",function(a,c,b,f,g,n,v,h){function l(){var d=p(),m=r.current;if(d&&m&&d.$$route===m.$$route&&e.equals(d.pathParams,m.pathParams)&&!d.reloadOnSearch&&!u)m.params=d.params,e.copy(m.params,b),a.$broadcast("$routeUpdate",m);else if(d||m)u=!1,a.$broadcast("$routeChangeStart",
d,m),(r.current=d)&&d.redirectTo&&(e.isString(d.redirectTo)?c.path(t(d.redirectTo,d.params)).search(d.params).replace():c.url(d.redirectTo(d.pathParams,c.path(),c.search())).replace()),f.when(d).then(function(){if(d){var a=e.extend({},d.resolve),c,b;e.forEach(a,function(d,c){a[c]=e.isString(d)?g.get(d):g.invoke(d)});e.isDefined(c=d.template)?e.isFunction(c)&&(c=c(d.params)):e.isDefined(b=d.templateUrl)&&(e.isFunction(b)&&(b=b(d.params)),b=h.getTrustedResourceUrl(b),e.isDefined(b)&&(d.loadedTemplateUrl=
b,c=n.get(b,{cache:v}).then(function(a){return a.data})));e.isDefined(c)&&(a.$template=c);return f.all(a)}}).then(function(c){d==r.current&&(d&&(d.locals=c,e.copy(d.params,b)),a.$broadcast("$routeChangeSuccess",d,m))},function(c){d==r.current&&a.$broadcast("$routeChangeError",d,m,c)})}function p(){var a,b;e.forEach(k,function(f,k){var q;if(q=!b){var g=c.path();q=f.keys;var l={};if(f.regexp)if(g=f.regexp.exec(g)){for(var h=1,p=g.length;h<p;++h){var n=q[h-1],r="string"==typeof g[h]?decodeURIComponent(g[h]):
g[h];n&&r&&(l[n.name]=r)}q=l}else q=null;else q=null;q=a=q}q&&(b=s(f,{params:e.extend({},c.search(),a),pathParams:a}),b.$$route=f)});return b||k[null]&&s(k[null],{params:{},pathParams:{}})}function t(a,c){var b=[];e.forEach((a||"").split(":"),function(a,d){if(0===d)b.push(a);else{var e=a.match(/(\w+)(.*)/),f=e[1];b.push(c[f]);b.push(e[2]||"");delete c[f]}});return b.join("")}var u=!1,r={routes:k,reload:function(){u=!0;a.$evalAsync(l)}};a.$on("$locationChangeSuccess",l);return r}]});n.provider("$routeParams",
function(){this.$get=function(){return{}}});n.directive("ngView",x);n.directive("ngView",z);x.$inject=["$route","$anchorScroll","$animate"];z.$inject=["$compile","$controller","$route"]})(window,window.angular);
//# sourceMappingURL=angular-route.min.js.map
{
"version":3,
"file":"angular-route.min.js",
"lineCount":13,
"mappings":"A;;;;;aAKC,SAAQ,CAACA,CAAD,CAASC,CAAT,CAAkBC,CAAlB,CAA6B,CAmzBtCC,QAASA,EAAa,CAAIC,CAAJ,CAAcC,CAAd,CAA+BC,CAA/B,CAAyC,CAC7D,MAAO,UACK,KADL,UAEK,CAAA,CAFL,UAGK,GAHL,YAIO,SAJP,MAKCC,QAAQ,CAACC,CAAD,CAAQC,CAAR,CAAkBC,CAAlB,CAAwBC,CAAxB,CAA8BC,CAA9B,CAA2C,CASrDC,QAASA,EAAe,EAAG,CACrBC,CAAJ,GACEA,CAAAC,SAAA,EACA,CAAAD,CAAA,CAAe,IAFjB,CAIGE,EAAH,GACEV,CAAAW,MAAA,CAAeD,CAAf,CACA,CAAAA,CAAA,CAAiB,IAFnB,CALyB,CAW3BE,QAASA,EAAM,EAAG,CAAA,IACZC,EAASf,CAAAgB,QAATD,EAA2Bf,CAAAgB,QAAAD,OAG/B,IAAIlB,CAAAoB,UAAA,CAFWF,CAEX,EAFqBA,CAAAG,UAErB,CAAJ,CAAiC,CAC3BC,IAAAA,EAAWf,CAAAgB,KAAA,EAAXD,CACAH,EAAUhB,CAAAgB,QAkBdJ,EAAA,CAVYJ,CAAAa,CAAYF,CAAZE,CAAsB,QAAQ,CAACA,CAAD,CAAQ,CAChDnB,CAAAoB,MAAA,CAAeD,CAAf,CAAsB,IAAtB,CAA4BT,CAA5B,EAA8CP,CAA9C,CAAwDkB,QAAuB,EAAG,CAC5E,CAAA1B,CAAAoB,UAAA,CAAkBO,CAAlB,CAAJ,EACOA,CADP,EACwB,CAAApB,CAAAqB,MAAA,CAAYD,CAAZ,CADxB,EAEEvB,CAAA,EAH8E,CAAlF,CAMAQ,EAAA,EAPgD,CAAtCY,CAWZX,EAAA,CAAeM,CAAAZ,MAAf,CAA+Be,CAC/BT,EAAAgB,MAAA,CAAmB,oBAAnB,CACAhB,EAAAe,MAAA,CAAmBE,CAAnB,CAvB+B,CAAjC,IAyBElB,EAAA,EA7Bc,CApBmC,IACjDC,CADiD,CAEjDE,CAFiD,CAGjDY,EAAgBlB,CAAAsB,WAHiC,CAIjDD,EAAYrB,CAAAuB,OAAZF,EAA2B,EAE/BvB;CAAA0B,IAAA,CAAU,qBAAV,CAAiChB,CAAjC,CACAA,EAAA,EAPqD,CALpD,CADsD,CAoE/DiB,QAASA,EAAwB,CAACC,CAAD,CAAWC,CAAX,CAAwBjC,CAAxB,CAAgC,CAC/D,MAAO,UACK,KADL,UAEM,IAFN,MAGCG,QAAQ,CAACC,CAAD,CAAQC,CAAR,CAAkB,CAAA,IAC1BW,EAAUhB,CAAAgB,QADgB,CAE1BD,EAASC,CAAAD,OAEbV,EAAA6B,KAAA,CAAcnB,CAAAG,UAAd,CAEA,KAAIf,EAAO6B,CAAA,CAAS3B,CAAA8B,SAAA,EAAT,CAEPnB,EAAAoB,WAAJ,GACErB,CAAAsB,OAMA,CANgBjC,CAMhB,CALIgC,CAKJ,CALiBH,CAAA,CAAYjB,CAAAoB,WAAZ,CAAgCrB,CAAhC,CAKjB,CAJIC,CAAAsB,aAIJ,GAHElC,CAAA,CAAMY,CAAAsB,aAAN,CAGF,CAHgCF,CAGhC,EADA/B,CAAAkC,KAAA,CAAc,yBAAd,CAAyCH,CAAzC,CACA,CAAA/B,CAAAmC,SAAA,EAAAD,KAAA,CAAyB,yBAAzB,CAAoDH,CAApD,CAPF,CAUAjC,EAAA,CAAKC,CAAL,CAlB8B,CAH3B,CADwD,CAp2B7DqC,CAAAA,CAAgB5C,CAAA6C,OAAA,CAAe,SAAf,CAA0B,CAAC,IAAD,CAA1B,CAAAC,SAAA,CACa,QADb,CAkBpBC,QAAuB,EAAE,CACvBC,QAASA,EAAO,CAACC,CAAD,CAASC,CAAT,CAAgB,CAC9B,MAAOlD,EAAAmD,OAAA,CAAe,KAAKnD,CAAAm
D,OAAA,CAAe,QAAQ,EAAG,EAA1B,CAA8B,WAAWF,CAAX,CAA9B,CAAL,CAAf,CAA0EC,CAA1E,CADuB,CA2IhCE,QAASA,EAAU,CAACC,CAAD;AAAOC,CAAP,CAAa,CAAA,IAC1BC,EAAcD,CAAAE,qBADY,CAE1BC,EAAM,cACUJ,CADV,QAEIA,CAFJ,CAFoB,CAM1BK,EAAOD,CAAAC,KAAPA,CAAkB,EAEtBL,EAAA,CAAOA,CAAAM,QAAA,CACI,UADJ,CACgB,MADhB,CAAAA,QAAA,CAEI,uBAFJ,CAE6B,QAAQ,CAACC,CAAD,CAAIC,CAAJ,CAAWC,CAAX,CAAgBC,CAAhB,CAAuB,CAC3DC,CAAAA,CAAsB,GAAX,GAAAD,CAAA,CAAiBA,CAAjB,CAA0B,IACrCE,EAAAA,CAAkB,GAAX,GAAAF,CAAA,CAAiBA,CAAjB,CAA0B,IACrCL,EAAAQ,KAAA,CAAU,MAAQJ,CAAR,UAAuB,CAAC,CAACE,CAAzB,CAAV,CACAH,EAAA,CAAQA,CAAR,EAAiB,EACjB,OAAO,EAAP,EACKG,CAAA,CAAW,EAAX,CAAgBH,CADrB,EAEI,KAFJ,EAGKG,CAAA,CAAWH,CAAX,CAAmB,EAHxB,GAIKI,CAJL,EAIa,OAJb,EAIwB,SAJxB,GAKKD,CALL,EAKiB,EALjB,EAMI,GANJ,EAOKA,CAPL,EAOiB,EAPjB,CAL+D,CAF5D,CAAAL,QAAA,CAgBI,YAhBJ,CAgBkB,MAhBlB,CAkBPF,EAAAU,OAAA,CAAiBC,MAAJ,CAAW,GAAX,CAAiBf,CAAjB,CAAwB,GAAxB,CAA6BE,CAAA,CAAc,GAAd,CAAoB,EAAjD,CACb,OAAOE,EA3BuB,CAvIhC,IAAIY,EAAS,EAsGb,KAAAC,KAAA,CAAYC,QAAQ,CAAClB,CAAD,CAAOmB,CAAP,CAAc,CAChCH,CAAA,CAAOhB,CAAP,CAAA,CAAerD,CAAAmD,OAAA,CACb,gBAAiB,CAAA,CAAjB,CADa,CAEbqB,CAFa,CAGbnB,CAHa,EAGLD,CAAA,CAAWC,CAAX,CAAiBmB,CAAjB,CAHK,CAOf,IAAInB,CAAJ,CAAU,CACR,IAAIoB,EAAuC,GACxB,EADCpB,CAAA,CAAKA,CAAAqB,OAAL,CAAiB,CAAjB,CACD,CAAXrB,CAAAsB,OAAA,CAAY,CAAZ,CAAetB,CAAAqB,OAAf;AAA2B,CAA3B,CAAW,CACXrB,CADW,CACL,GAEdgB,EAAA,CAAOI,CAAP,CAAA,CAAuBzE,CAAAmD,OAAA,CACrB,YAAaE,CAAb,CADqB,CAErBD,CAAA,CAAWqB,CAAX,CAAyBD,CAAzB,CAFqB,CALf,CAWV,MAAO,KAnByB,CA2ElC,KAAAI,UAAA,CAAiBC,QAAQ,CAACC,CAAD,CAAS,CAChC,IAAAR,KAAA,CAAU,IAAV,CAAgBQ,CAAhB,CACA,OAAO,KAFyB,CAMlC,KAAAC,KAAA,CAAY,CAAC,YAAD,CACC,WADD,CAEC,cAFD,CAGC,IAHD,CAIC,WAJD,CAKC,OALD,CAMC,gBAND,CAOC,MAPD,CAQR,QAAQ,CAACC,CAAD,CAAaC,CAAb,CAAwBC,CAAxB,CAAsCC,CAAtC,CAA0CC,CAA1C,CAAqDC,CAArD,CAA4DC,CAA5D,CAA4EC,CAA5E,CAAkF,CA4P5FC,QAASA,EAAW,EAAG,CAAA,IACjBC,EAAOC,CAAA,EADU,CAEjBC,EAAOxF,CAAAgB,QAEX,IAAIsE,CAAJ,EAAYE,CAAZ,EAAoBF,CAAAG,QAApB,GAAqCD,CAAAC,QAArC,EACO5F,CAAA6F,OAAA,CAAeJ,CAAAK,WAAf,CAAgCH,CAAAG,WAAhC,CADP,EAEO,CAACL,CAAAM,eAFR,EAE+B,CAACC,CAFhC,CAG
EL,CAAAb,OAEA,CAFcW,CAAAX,OAEd,CADA9E,CAAAiG,KAAA,CAAaN,CAAAb,OAAb,CAA0BI,CAA1B,CACA,CAAAF,CAAAkB,WAAA,CAAsB,cAAtB,CAAsCP,CAAtC,CALF,KAMO,IAAIF,CAAJ,EAAYE,CAAZ,CACLK,CAeA,CAfc,CAAA,CAed,CAdAhB,CAAAkB,WAAA,CAAsB,mBAAtB,CAA2CT,CAA3C,CAAiDE,CAAjD,CAcA,EAbAxF,CAAAgB,QAaA;AAbiBsE,CAajB,GAXMA,CAAAU,WAWN,GAVQnG,CAAAoG,SAAA,CAAiBX,CAAAU,WAAjB,CAAJ,CACElB,CAAA5B,KAAA,CAAegD,CAAA,CAAYZ,CAAAU,WAAZ,CAA6BV,CAAAX,OAA7B,CAAf,CAAAwB,OAAA,CAAiEb,CAAAX,OAAjE,CAAAnB,QAAA,EADF,CAIEsB,CAAAsB,IAAA,CAAcd,CAAAU,WAAA,CAAgBV,CAAAK,WAAhB,CAAiCb,CAAA5B,KAAA,EAAjC,CAAmD4B,CAAAqB,OAAA,EAAnD,CAAd,CAAA3C,QAAA,EAMN,EAAAwB,CAAAb,KAAA,CAAQmB,CAAR,CAAAe,KAAA,CACO,QAAQ,EAAG,CACd,GAAIf,CAAJ,CAAU,CAAA,IACJvE,EAASlB,CAAAmD,OAAA,CAAe,EAAf,CAAmBsC,CAAAgB,QAAnB,CADL,CAEJC,CAFI,CAEMC,CAEd3G,EAAA4G,QAAA,CAAgB1F,CAAhB,CAAwB,QAAQ,CAAC2F,CAAD,CAAQ/C,CAAR,CAAa,CAC3C5C,CAAA,CAAO4C,CAAP,CAAA,CAAc9D,CAAAoG,SAAA,CAAiBS,CAAjB,CAAA,CACVzB,CAAA0B,IAAA,CAAcD,CAAd,CADU,CACazB,CAAA2B,OAAA,CAAiBF,CAAjB,CAFgB,CAA7C,CAKI7G,EAAAoB,UAAA,CAAkBsF,CAAlB,CAA6BjB,CAAAiB,SAA7B,CAAJ,CACM1G,CAAAgH,WAAA,CAAmBN,CAAnB,CADN,GAEIA,CAFJ,CAEeA,CAAA,CAASjB,CAAAX,OAAT,CAFf,EAIW9E,CAAAoB,UAAA,CAAkBuF,CAAlB,CAAgClB,CAAAkB,YAAhC,CAJX,GAKM3G,CAAAgH,WAAA,CAAmBL,CAAnB,CAIJ,GAHEA,CAGF,CAHgBA,CAAA,CAAYlB,CAAAX,OAAZ,CAGhB,EADA6B,CACA,CADcpB,CAAA0B,sBAAA,CAA2BN,CAA3B,CACd,CAAI3G,CAAAoB,UAAA,CAAkBuF,CAAlB,CAAJ,GACElB,CAAAyB,kBACA,CADyBP,CACzB,CAAAD,CAAA,CAAWrB,CAAAyB,IAAA,CAAUH,CAAV;AAAuB,OAAQrB,CAAR,CAAvB,CAAAkB,KAAA,CACF,QAAQ,CAACW,CAAD,CAAW,CAAE,MAAOA,EAAAzE,KAAT,CADjB,CAFb,CATF,CAeI1C,EAAAoB,UAAA,CAAkBsF,CAAlB,CAAJ,GACExF,CAAA,UADF,CACwBwF,CADxB,CAGA,OAAOvB,EAAAiC,IAAA,CAAOlG,CAAP,CA3BC,CADI,CADlB,CAAAsF,KAAA,CAiCO,QAAQ,CAACtF,CAAD,CAAS,CAChBuE,CAAJ,EAAYtF,CAAAgB,QAAZ,GACMsE,CAIJ,GAHEA,CAAAvE,OACA,CADcA,CACd,CAAAlB,CAAAiG,KAAA,CAAaR,CAAAX,OAAb,CAA0BI,CAA1B,CAEF,EAAAF,CAAAkB,WAAA,CAAsB,qBAAtB,CAA6CT,CAA7C,CAAmDE,CAAnD,CALF,CADoB,CAjCxB,CAyCK,QAAQ,CAAC0B,CAAD,CAAQ,CACb5B,CAAJ,EAAYtF,CAAAgB,QAAZ,EACE6D,CAAAkB,WAAA,CAAsB,mBAAtB,CAA2CT,CAA3C,CAAiDE,CAAjD,CA
AuD0B,CAAvD,CAFe,CAzCrB,CA1BmB,CA+EvB3B,QAASA,EAAU,EAAG,CAAA,IAEhBZ,CAFgB,CAERwC,CACZtH,EAAA4G,QAAA,CAAgBvC,CAAhB,CAAwB,QAAQ,CAACG,CAAD,CAAQnB,CAAR,CAAc,CACxC,IAAA,CAAA,IAAA,CAAA,CAAA,CAAA,CAAA,CAAA,CAAW,IAAA,EAAA,CAAA,KAAA,EAzGbK,EAAAA,CAyGac,CAzGNd,KAAX,KACIoB,EAAS,EAEb,IAsGiBN,CAtGZL,OAAL,CAGA,GADIoD,CACJ,CAmGiB/C,CApGTL,OAAAqD,KAAA,CAAkBC,CAAlB,CACR,CAAA,CAEA,IATqC,IAS5BC,EAAI,CATwB,CASrBC,EAAMJ,CAAA7C,OAAtB,CAAgCgD,CAAhC,CAAoCC,CAApC,CAAyC,EAAED,CAA3C,CAA8C,CAC5C,IAAI5D,EAAMJ,CAAA,CAAKgE,CAAL,CAAS,CAAT,CAAV,CAEIE,EAAM,QACA,EADY,MAAOL,EAAA,CAAEG,CAAF,CACnB,CAAFG,kBAAA,CAAmBN,CAAA,CAAEG,CAAF,CAAnB,CAAE,CACFH,CAAA,CAAEG,CAAF,CAEJ5D;CAAJ,EAAW8D,CAAX,GACE9C,CAAA,CAAOhB,CAAAgE,KAAP,CADF,CACqBF,CADrB,CAP4C,CAW9C,CAAA,CAAO9C,CAbP,CAAA,IAAQ,EAAA,CAAO,IAHf,KAAmB,EAAA,CAAO,IAsGT,EAAA,CAAA,CAAA,CAAA,CAAX,CAAA,CAAJ,GACEwC,CAGA,CAHQtE,CAAA,CAAQwB,CAAR,CAAe,QACbxE,CAAAmD,OAAA,CAAe,EAAf,CAAmB8B,CAAAqB,OAAA,EAAnB,CAAuCxB,CAAvC,CADa,YAETA,CAFS,CAAf,CAGR,CAAAwC,CAAA1B,QAAA,CAAgBpB,CAJlB,CAD4C,CAA9C,CASA,OAAO8C,EAAP,EAAgBjD,CAAA,CAAO,IAAP,CAAhB,EAAgCrB,CAAA,CAAQqB,CAAA,CAAO,IAAP,CAAR,CAAsB,QAAS,EAAT,YAAwB,EAAxB,CAAtB,CAZZ,CAkBtBgC,QAASA,EAAW,CAAC0B,CAAD,CAASjD,CAAT,CAAiB,CACnC,IAAIkD,EAAS,EACbhI,EAAA4G,QAAA,CAAiBqB,CAAAF,CAAAE,EAAQ,EAARA,OAAA,CAAkB,GAAlB,CAAjB,CAAyC,QAAQ,CAACC,CAAD,CAAUR,CAAV,CAAa,CAC5D,GAAU,CAAV,GAAIA,CAAJ,CACEM,CAAA9D,KAAA,CAAYgE,CAAZ,CADF,KAEO,CACL,IAAIC,EAAeD,CAAAZ,MAAA,CAAc,WAAd,CAAnB,CACIxD,EAAMqE,CAAA,CAAa,CAAb,CACVH,EAAA9D,KAAA,CAAYY,CAAA,CAAOhB,CAAP,CAAZ,CACAkE,EAAA9D,KAAA,CAAYiE,CAAA,CAAa,CAAb,CAAZ,EAA+B,EAA/B,CACA,QAAOrD,CAAA,CAAOhB,CAAP,CALF,CAHqD,CAA9D,CAWA,OAAOkE,EAAAI,KAAA,CAAY,EAAZ,CAb4B,CA7VuD,IA8LxFpC,EAAc,CAAA,CA9L0E,CA+LxF7F,EAAS,QACCkE,CADD,QAeCgE,QAAQ,EAAG,CACjBrC,CAAA,CAAc,CAAA,CACdhB,EAAAsD,WAAA,CAAsB9C,CAAtB,CAFiB,CAfZ,CAqBbR,EAAA/C,IAAA,CAAe,wBAAf,CAAyCuD,CAAzC,CAEA,OAAOrF,EAtNqF,CARlF,CA5LW,CAlBL,CAqkBpByC,EAAAE,SAAA,CAAuB,cAAvB;AAoCAyF,QAA6B,EAAG,CAC9B,IAAAxD,KAAA,CAAYyD,QAAQ,EAAG,CAAE,MAAO,EAAT,CADO,CApChC,CAwCA5F,EAAA6F,UAAA,CAAwB,QAA
xB,CAAkCvI,CAAlC,CACA0C,EAAA6F,UAAA,CAAwB,QAAxB,CAAkCvG,CAAlC,CAiLAhC,EAAAwI,QAAA,CAAwB,CAAC,QAAD,CAAW,eAAX,CAA4B,UAA5B,CAoExBxG,EAAAwG,QAAA,CAAmC,CAAC,UAAD,CAAa,aAAb,CAA4B,QAA5B,CAt3BG,CAArC,CAAA,CAm5BE3I,MAn5BF,CAm5BUA,MAAAC,QAn5BV;",
"sources":["angular-route.js"],
"names":["window","angular","undefined","ngViewFactory","$route","$anchorScroll","$animate","link","scope","$element","attr","ctrl","$transclude","cleanupLastView","currentScope","$destroy","currentElement","leave","update","locals","current","isDefined","$template","newScope","$new","clone","enter","onNgViewEnter","autoScrollExp","$eval","$emit","onloadExp","autoscroll","onload","$on","ngViewFillContentFactory","$compile","$controller","html","contents","controller","$scope","controllerAs","data","children","ngRouteModule","module","provider","$RouteProvider","inherit","parent","extra","extend","pathRegExp","path","opts","insensitive","caseInsensitiveMatch","ret","keys","replace","_","slash","key","option","optional","star","push","regexp","RegExp","routes","when","this.when","route","redirectPath","length","substr","otherwise","this.otherwise","params","$get","$rootScope","$location","$routeParams","$q","$injector","$http","$templateCache","$sce","updateRoute","next","parseRoute","last","$$route","equals","pathParams","reloadOnSearch","forceReload","copy","$broadcast","redirectTo","isString","interpolate","search","url","then","resolve","template","templateUrl","forEach","value","get","invoke","isFunction","getTrustedResourceUrl","loadedTemplateUrl","response","all","error","match","m","exec","on","i","len","val","decodeURIComponent","name","string","result","split","segment","segmentMatch","join","reload","$evalAsync","$RouteParamsProvider","this.$get","directive","$inject"]
}
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
--> <!-- Page Content -->
<div style="margin-top:-20px;padding-top:20px; background-color:#eeeded;padding-bottom:10px;border-bottom:solid 1px #d9d9d8;">
<form name="form" class="container">
<div class="col-lg-7">
<div class="row input-group">
<input type="text" ng-model="query" ng-keyup="$event.keyCode == 13 && executeSearch()" class="form-control search_table" placeholder="Search">
<span class="input-group-btn">
<button class="btn btn-success" ng-click="executeSearch()" type="submit"><i class="glyphicon glyphicon-search white "></i></button>
</span>
</div>
<div class="row">
<small style="color:#999999; margin-top:2px;">property=HiveLineage.executionEngine&text=tez</small><br/>
<small style="color:#999999; margin-top:2px;">property=type&text=HiveLineage</small><br/>
<small style="color:#999999; margin-top:2px;">property=type&text=hive_table</small>
</div>
</div>
</form>
</div>
<div class="container" style="min-height:330px;">
<div class="row">
<div class="col-lg-11">
<div ng-hide="iswiki">
<input type="hidden" ng-model="iserror">
<h4 class="txt1" ng-hide="!matchingResults">{{matchingResults}} results matching your query "{{SearchQuery}}" were found</h4>
<ul class="list-unstyled">
<li ng-hide="iserror" class="sm-txt1" ng-repeat="entity in entities"><u><a href="#Search/{{entity.guid}}" ng-click="StoreJson(entity)" style="line-height: 2.5;">{{ entity.guid}}</a></u>
</li>
<li ng-show="iserror" class="sm-txt1"></li>
</ul>
</div>
<div ng-show="iswiki" data-ng-include="selectedDefination.path"></div>
</div>
</div>
</div>
{
"directory": "public/lib",
"directory": "dist/lib",
"storage": {
"packages": ".bower-cache",
"registry": ".bower-registry"
......
......@@ -23,3 +23,6 @@ public/lib
public/dist
*.log
*.tgz
node/
dist/
**/app.min.js
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
test/coverage/**
\ No newline at end of file
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
{
"browser": true, // Standard browser globals e.g. `window`, `document`.
"bitwise": false, // Prohibit bitwise operators (&, |, ^, etc.).
......
File mode changed from 100755 to 100644
File mode changed from 100755 to 100644