Java Code Examples for org.apache.hadoop.hive.ql.metadata.Table#getCols()

The following examples show how to use org.apache.hadoop.hive.ql.metadata.Table#getCols(). You can go to the original project or source file by following the links above each example.
Example 1
Source File: AlterTableRenameCol.java    From atlas with Apache License 2.0
public static FieldSchema findRenamedColumn(Table inputTable, Table outputTable) {
    FieldSchema       ret           = null;
    List<FieldSchema> inputColumns  = inputTable.getCols();
    List<FieldSchema> outputColumns = outputTable.getCols();

    for (FieldSchema inputColumn : inputColumns) {
        if (!outputColumns.contains(inputColumn)) {
            ret = inputColumn;

            break;
        }
    }

    return ret;
}
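The lookup above relies on FieldSchema equality: after a rename, the old column is the first entry of the input list that no longer appears in the output list. The same pattern can be sketched with plain Java collections (the class and method names below are illustrative, not part of the Hive API):

```java
import java.util.Arrays;
import java.util.List;

public class FindRenamedColumnSketch {
    // Mimics findRenamedColumn: return the first element of 'before'
    // that no longer appears in 'after', or null if none is missing.
    static String findFirstMissing(List<String> before, List<String> after) {
        for (String col : before) {
            if (!after.contains(col)) {
                return col;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<String> before = Arrays.asList("id", "name", "ts");
        List<String> after  = Arrays.asList("id", "full_name", "ts");
        // "name" was renamed to "full_name", so it is the first missing column.
        System.out.println(findFirstMissing(before, after)); // prints "name"
    }
}
```

Note that this approach reports only the first difference; if several columns were renamed in one statement, the later ones would be missed.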
 
Example 2
Source File: HiveCatalogUtil.java    From tajo with Apache License 2.0
public static void validateSchema(Table tblSchema) {
  for (FieldSchema fieldSchema : tblSchema.getCols()) {
    String fieldType = fieldSchema.getType();
    if (fieldType.equalsIgnoreCase("ARRAY") || fieldType.equalsIgnoreCase("STRUCT")
      || fieldType.equalsIgnoreCase("MAP")) {
      throw new TajoRuntimeException(new UnsupportedException("data type '" + fieldType.toUpperCase() + "'"));
    }
  }
}
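The validation above iterates over every column from getCols() and rejects the schema as soon as it sees one of Hive's complex types (ARRAY, STRUCT, MAP), comparing type names case-insensitively. A minimal stand-alone sketch of the same check, using IllegalArgumentException in place of Tajo's TajoRuntimeException (class and method names are illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class ComplexTypeCheckSketch {
    private static final List<String> UNSUPPORTED = Arrays.asList("ARRAY", "STRUCT", "MAP");

    // Mimics validateSchema's per-field test: throw if the field type is one
    // of the complex Hive types the catalog cannot represent.
    static void validateFieldType(String fieldType) {
        if (UNSUPPORTED.contains(fieldType.toUpperCase())) {
            throw new IllegalArgumentException("data type '" + fieldType.toUpperCase() + "'");
        }
    }

    public static void main(String[] args) {
        validateFieldType("string");    // primitive type: accepted
        validateFieldType("int");       // primitive type: accepted
        try {
            validateFieldType("map");   // complex type: rejected, case-insensitively
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // prints "data type 'MAP'"
        }
    }
}
```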
 
Example 3
Source File: SentryFilterDDLTask.java    From incubator-sentry with Apache License 2.0
/**
 * Filter the command "show columns in table".
 */
private int showFilterColumns(ShowColumnsDesc showCols) throws HiveException {
  Table table = Hive.get(conf).getTable(showCols.getTableName());

  // write the results in the file
  DataOutputStream outStream = null;
  try {
    Path resFile = new Path(showCols.getResFile());
    FileSystem fs = resFile.getFileSystem(conf);
    outStream = fs.create(resFile);

    List<FieldSchema> cols = table.getCols();
    cols.addAll(table.getPartCols());
    // In case the query is served by HiveServer2, don't pad it with spaces,
    // as HiveServer2 output is consumed by JDBC/ODBC clients.
    boolean isOutputPadded = !SessionState.get().isHiveServerQuery();
    outStream.writeBytes(MetaDataFormatUtils.getAllColumnsInformation(
        fiterColumns(cols, table), false, isOutputPadded, null));
    outStream.close();
    outStream = null;
  } catch (IOException e) {
    throw new HiveException(e, ErrorMsg.GENERIC_ERROR);
  } finally {
    IOUtils.closeStream(outStream);
  }
  return 0;
}