Java Code Examples for org.apache.hadoop.classification.InterfaceAudience

The following examples show how to use org.apache.hadoop.classification.InterfaceAudience, the annotation Hadoop uses to mark the intended audience of an API element (Public, LimitedPrivate, or Private). The examples are extracted from open source projects.
Example 1
Project: hadoop   File: SecurityUtil.java
/**
 * Convert Kerberos principal name pattern to valid Kerberos principal names.
 * This method is similar to {@link #getServerPrincipal(String, String)},
 * except 1) the reverse DNS lookup from addr to hostname is done only when
 * necessary, 2) param addr can't be null (no default behavior of using local
 * hostname when addr is null).
 * 
 * @param principalConfig
 *          Kerberos principal name pattern to convert
 * @param addr
 *          InetAddress of the host used for substitution
 * @return converted Kerberos principal name
 * @throws IOException if the client address cannot be determined
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public static String getServerPrincipal(String principalConfig,
    InetAddress addr) throws IOException {
  String[] components = getComponents(principalConfig);
  if (components == null || components.length != 3
      || !components[1].equals(HOSTNAME_PATTERN)) {
    return principalConfig;
  } else {
    if (addr == null) {
      throw new IOException("Can't replace " + HOSTNAME_PATTERN
          + " pattern since client address is null");
    }
    return replacePattern(components, addr.getCanonicalHostName());
  }
}
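
A minimal usage sketch, assuming hadoop-common is on the classpath; the principal pattern and address below are illustrative and not part of the original example:

import java.net.InetAddress;
import org.apache.hadoop.security.SecurityUtil;

public class GetServerPrincipalDemo {
  public static void main(String[] args) throws Exception {
    // _HOST is the hostname pattern that gets replaced with the canonical host name.
    String pattern = "nn/_HOST@EXAMPLE.COM";        // illustrative principal pattern
    InetAddress addr = InetAddress.getLocalHost();  // host used for the substitution
    // Prints e.g. nn/host1.example.com@EXAMPLE.COM; a null addr would throw IOException.
    System.out.println(SecurityUtil.getServerPrincipal(pattern, addr));
  }
}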
 
Example 2
Project: hadoop   File: CompositeContext.java
@InterfaceAudience.Private
public void init(String contextName, ContextFactory factory) {
  super.init(contextName, factory);
  int nKids;
  try {
    String sKids = getAttribute(ARITY_LABEL);
    nKids = Integer.parseInt(sKids);
  } catch (Exception e) {
    LOG.error("Unable to initialize composite metric " + contextName +
              ": could not init arity", e);
    return;
  }
  for (int i = 0; i < nKids; ++i) {
    MetricsContext ctxt = MetricsUtil.getContext(
        String.format(SUB_FMT, contextName, i), contextName);
    if (null != ctxt) {
      subctxt.add(ctxt);
    }
  }
}
 
Example 3
Project: hadoop   File: FileSystem.java
/**
 * This method provides the default implementation of
 * {@link #access(Path, FsAction)}.
 *
 * @param stat FileStatus to check
 * @param mode type of access to check
 * @throws IOException for any error
 */
@InterfaceAudience.Private
static void checkAccessPermissions(FileStatus stat, FsAction mode)
    throws IOException {
  FsPermission perm = stat.getPermission();
  UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
  String user = ugi.getShortUserName();
  List<String> groups = Arrays.asList(ugi.getGroupNames());
  if (user.equals(stat.getOwner())) {
    if (perm.getUserAction().implies(mode)) {
      return;
    }
  } else if (groups.contains(stat.getGroup())) {
    if (perm.getGroupAction().implies(mode)) {
      return;
    }
  } else {
    if (perm.getOtherAction().implies(mode)) {
      return;
    }
  }
  throw new AccessControlException(String.format(
    "Permission denied: user=%s, path=\"%s\":%s:%s:%s%s", user, stat.getPath(),
    stat.getOwner(), stat.getGroup(), stat.isDirectory() ? "d" : "-", perm));
}
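
Because checkAccessPermissions is package-private, external callers go through the public FileSystem#access(Path, FsAction) entry point that it backs by default. A hedged sketch, with an illustrative path:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.security.AccessControlException;

public class AccessCheckDemo {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    try {
      // The default implementation of access() delegates to checkAccessPermissions.
      fs.access(new Path("/tmp/data"), FsAction.READ);  // illustrative path
      System.out.println("read access granted");
    } catch (AccessControlException e) {
      System.out.println("read access denied: " + e.getMessage());
    }
  }
}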
 
Example 4
Project: hadoop-oss   File: SecurityUtil.java
/**
 * Login as a principal specified in config. Substitute $host in user's Kerberos principal 
 * name with hostname. If non-secure mode - return. If no keytab available -
 * bail out with an exception
 * 
 * @param conf
 *          conf to use
 * @param keytabFileKey
 *          the key to look for keytab file in conf
 * @param userNameKey
 *          the key to look for user's Kerberos principal name in conf
 * @param hostname
 *          hostname to use for substitution
 * @throws IOException if the config doesn't specify a keytab
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public static void login(final Configuration conf,
    final String keytabFileKey, final String userNameKey, String hostname)
    throws IOException {
  
  if(! UserGroupInformation.isSecurityEnabled()) 
    return;
  
  String keytabFilename = conf.get(keytabFileKey);
  if (keytabFilename == null || keytabFilename.length() == 0) {
    throw new IOException("Running in secure mode, but config doesn't have a keytab");
  }

  String principalConfig = conf.get(userNameKey, System
      .getProperty("user.name"));
  String principalName = SecurityUtil.getServerPrincipal(principalConfig,
      hostname);
  UserGroupInformation.loginUserFromKeytab(principalName, keytabFilename);
}
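
A sketch of a service calling this helper at startup; the two configuration keys, the keytab path, the principal, and the hostname are placeholders invented for the example:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.SecurityUtil;

public class KeytabLoginDemo {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("my.service.keytab.file", "/etc/security/keytabs/service.keytab");  // hypothetical key
    conf.set("my.service.kerberos.principal", "service/_HOST@EXAMPLE.COM");      // hypothetical key
    // Substitutes _HOST with the given hostname and logs in from the keytab;
    // returns immediately when security is disabled.
    SecurityUtil.login(conf, "my.service.keytab.file",
        "my.service.kerberos.principal", "host1.example.com");
  }
}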
 
Example 5
Project: hadoop   File: SecurityUtil.java
/**
 * Login as a principal specified in config. Substitute $host in user's Kerberos principal 
 * name with hostname. If non-secure mode - return. If no keytab available -
 * bail out with an exception
 * 
 * @param conf
 *          conf to use
 * @param keytabFileKey
 *          the key to look for keytab file in conf
 * @param userNameKey
 *          the key to look for user's Kerberos principal name in conf
 * @param hostname
 *          hostname to use for substitution
 * @throws IOException if the config doesn't specify a keytab
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public static void login(final Configuration conf,
    final String keytabFileKey, final String userNameKey, String hostname)
    throws IOException {
  
  if(! UserGroupInformation.isSecurityEnabled()) 
    return;
  
  String keytabFilename = conf.get(keytabFileKey);
  if (keytabFilename == null || keytabFilename.length() == 0) {
    throw new IOException("Running in secure mode, but config doesn't have a keytab");
  }

  String principalConfig = conf.get(userNameKey, System
      .getProperty("user.name"));
  String principalName = SecurityUtil.getServerPrincipal(principalConfig,
      hostname);
  UserGroupInformation.loginUserFromKeytab(principalName, keytabFilename);
}
 
Example 6
Project: hadoop-oss   File: FileSystem.java
/**
 * This method provides the default implementation of
 * {@link #access(Path, FsAction)}.
 *
 * @param stat FileStatus to check
 * @param mode type of access to check
 * @throws IOException for any error
 */
@InterfaceAudience.Private
static void checkAccessPermissions(FileStatus stat, FsAction mode)
    throws IOException {
  FsPermission perm = stat.getPermission();
  UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
  String user = ugi.getShortUserName();
  List<String> groups = Arrays.asList(ugi.getGroupNames());
  if (user.equals(stat.getOwner())) {
    if (perm.getUserAction().implies(mode)) {
      return;
    }
  } else if (groups.contains(stat.getGroup())) {
    if (perm.getGroupAction().implies(mode)) {
      return;
    }
  } else {
    if (perm.getOtherAction().implies(mode)) {
      return;
    }
  }
  throw new AccessControlException(String.format(
    "Permission denied: user=%s, path=\"%s\":%s:%s:%s%s", user, stat.getPath(),
    stat.getOwner(), stat.getGroup(), stat.isDirectory() ? "d" : "-", perm));
}
 
Example 7
Project: hadoop   File: MetricsRegistry.java
@InterfaceAudience.Private
public synchronized MutableRate newRate(String name, String desc,
    boolean extended, boolean returnExisting) {
  if (returnExisting) {
    MutableMetric rate = metricsMap.get(name);
    if (rate != null) {
      if (rate instanceof MutableRate) return (MutableRate) rate;
      throw new MetricsException("Unexpected metrics type "+ rate.getClass()
                                 +" for "+ name);
    }
  }
  checkMetricName(name);
  MutableRate ret = new MutableRate(name, desc, extended);
  metricsMap.put(name, ret);
  return ret;
}
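
A small sketch of how a metrics source might use this method; the registry and metric names are illustrative:

import org.apache.hadoop.metrics2.lib.MetricsRegistry;
import org.apache.hadoop.metrics2.lib.MutableRate;

public class RateMetricDemo {
  public static void main(String[] args) {
    MetricsRegistry registry = new MetricsRegistry("ExampleSource");
    MutableRate rpcTime = registry.newRate("RpcTime", "RPC processing time", false, true);
    rpcTime.add(42);  // record one sample, e.g. 42 ms
    // With returnExisting=true, asking again returns the same MutableRate instance.
    MutableRate same = registry.newRate("RpcTime", "RPC processing time", false, true);
    System.out.println(same == rpcTime);  // true
  }
}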
 
Example 8
Project: hadoop   File: AbstractCounters.java
/**
 * Construct from another counters object.
 * @param <C1> type of the other counter
 * @param <G1> type of the other counter group
 * @param counters the counters object to copy
 * @param groupFactory the factory for new groups
 */
@InterfaceAudience.Private
public <C1 extends Counter, G1 extends CounterGroupBase<C1>>
AbstractCounters(AbstractCounters<C1, G1> counters,
                 CounterGroupFactory<C, G> groupFactory) {
  this.groupFactory = groupFactory;
  for(G1 group: counters) {
    String name = group.getName();
    G newGroup = groupFactory.newGroup(name, group.getDisplayName(), limits);
    (isFrameworkGroup(name) ? fgroups : groups).put(name, newGroup);
    for(Counter counter: group) {
      newGroup.addCounter(counter.getName(), counter.getDisplayName(),
                          counter.getValue());
    }
  }
}
 
Example 9
Project: hadoop   File: ServerRMProxy.java
@InterfaceAudience.Private
@Override
protected InetSocketAddress getRMAddress(YarnConfiguration conf,
                                         Class<?> protocol) {
  if (protocol == ResourceTracker.class) {
    return conf.getSocketAddr(
      YarnConfiguration.RM_RESOURCE_TRACKER_ADDRESS,
      YarnConfiguration.DEFAULT_RM_RESOURCE_TRACKER_ADDRESS,
      YarnConfiguration.DEFAULT_RM_RESOURCE_TRACKER_PORT);
  } else {
    String message = "Unsupported protocol found when creating the proxy " +
        "connection to ResourceManager: " +
        ((protocol != null) ? protocol.getName() : "null");
    LOG.error(message);
    throw new IllegalStateException(message);
  }
}
 
Example 10
Project: hadoop-oss   File: UserGroupInformation.java
@InterfaceAudience.Private
@VisibleForTesting
static void reset() {
  authenticationMethod = null;
  conf = null;
  groups = null;
  kerberosMinSecondsBeforeRelogin = 0;
  setLoginUser(null);
  HadoopKerberosName.setRules(null);
}
 
Example 11
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * Return the current user, including any doAs in the current stack.
 * @return the current user
 * @throws IOException if login fails
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public synchronized
static UserGroupInformation getCurrentUser() throws IOException {
  AccessControlContext context = AccessController.getContext();
  Subject subject = Subject.getSubject(context);
  if (subject == null || subject.getPrincipals(User.class).isEmpty()) {
    return getLoginUser();
  } else {
    return new UserGroupInformation(subject);
  }
}
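
A minimal sketch comparing the current user with the login user; outside of any doAs block the two are normally the same:

import org.apache.hadoop.security.UserGroupInformation;

public class CurrentUserDemo {
  public static void main(String[] args) throws Exception {
    UserGroupInformation current = UserGroupInformation.getCurrentUser();
    System.out.println("current user: " + current.getUserName());
    System.out.println("login user:   " + UserGroupInformation.getLoginUser().getUserName());
  }
}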
 
Example 12
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * Get the currently logged in user.
 * @return the logged in user
 * @throws IOException if login fails
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public synchronized 
static UserGroupInformation getLoginUser() throws IOException {
  if (loginUser == null) {
    loginUserFromSubject(null);
  }
  return loginUser;
}
 
Example 13
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * Log a user in from a keytab file. Loads a user identity from a keytab
 * file and logs them in. They become the currently logged-in user.
 * @param user the principal name to load from the keytab
 * @param path the path to the keytab file
 * @throws IOException if the keytab file can't be read
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public synchronized
static void loginUserFromKeytab(String user,
                                String path
                                ) throws IOException {
  if (!isSecurityEnabled())
    return;

  keytabFile = path;
  keytabPrincipal = user;
  Subject subject = new Subject();
  LoginContext login; 
  long start = 0;
  try {
    login = newLoginContext(HadoopConfiguration.KEYTAB_KERBEROS_CONFIG_NAME,
          subject, new HadoopConfiguration());
    start = Time.now();
    login.login();
    metrics.loginSuccess.add(Time.now() - start);
    loginUser = new UserGroupInformation(subject);
    loginUser.setLogin(login);
    loginUser.setAuthenticationMethod(AuthenticationMethod.KERBEROS);
  } catch (LoginException le) {
    if (start > 0) {
      metrics.loginFailure.add(Time.now() - start);
    }
    throw new IOException("Login failure for " + user + " from keytab " + 
                          path+ ": " + le, le);
  }
  LOG.info("Login successful for user " + keytabPrincipal
      + " using keytab file " + keytabFile);
}
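
A sketch pairing this login with the logout shown in Example 14; the principal and keytab path are placeholders:

import org.apache.hadoop.security.UserGroupInformation;

public class KeytabSessionDemo {
  public static void main(String[] args) throws Exception {
    UserGroupInformation.loginUserFromKeytab(
        "service/host1.example.com@EXAMPLE.COM",   // placeholder principal
        "/etc/security/keytabs/service.keytab");   // placeholder keytab path
    UserGroupInformation ugi = UserGroupInformation.getLoginUser();
    System.out.println("logged in as " + ugi.getUserName());
    ugi.logoutUserFromKeytab();                    // see Example 14
  }
}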
 
Example 14
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * Log the current user out who previously logged in using keytab.
 * This method assumes that the user logged in by calling
 * {@link #loginUserFromKeytab(String, String)}.
 *
 * @throws IOException if a failure occurred in logout, or if the user did
 * not log in by invoking loginUserFromKeyTab() before.
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public void logoutUserFromKeytab() throws IOException {
  if (!isSecurityEnabled() ||
      user.getAuthenticationMethod() != AuthenticationMethod.KERBEROS) {
    return;
  }
  LoginContext login = getLogin();
  if (login == null || keytabFile == null) {
    throw new IOException("loginUserFromKeytab must be done first");
  }

  try {
    if (LOG.isDebugEnabled()) {
      LOG.debug("Initiating logout for " + getUserName());
    }
    synchronized (UserGroupInformation.class) {
      login.logout();
    }
  } catch (LoginException le) {
    throw new IOException("Logout failure for " + user + " from keytab " +
        keytabFile + ": " + le,
        le);
  }

  LOG.info("Logout successful for user " + keytabPrincipal
      + " using keytab file " + keytabFile);
}
 
Example 15
Project: hadoop   File: CompositeContext.java
@InterfaceAudience.Private
@Override
public void unregisterUpdater(Updater updater) {
  for (MetricsContext ctxt : subctxt) {
    ctxt.unregisterUpdater(updater);
  }
}
 
Example 16
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * Create a user from a login name. It is intended to be used for remote
 * users in RPC, since it won't have any credentials.
 * @param user the full user principal name, must not be empty or null
 * @return the UserGroupInformation for the remote user.
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public static UserGroupInformation createRemoteUser(String user, AuthMethod authMethod) {
  if (user == null || user.isEmpty()) {
    throw new IllegalArgumentException("Null user");
  }
  Subject subject = new Subject();
  subject.getPrincipals().add(new User(user));
  UserGroupInformation result = new UserGroupInformation(subject);
  result.setAuthenticationMethod(authMethod);
  return result;
}
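
A sketch of creating a credential-less remote user, much as an RPC server would for an incoming caller, and running work as that user with doAs (see Example 20); the user name and auth method are illustrative:

import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.security.SaslRpcServer.AuthMethod;
import org.apache.hadoop.security.UserGroupInformation;

public class RemoteUserDemo {
  public static void main(String[] args) throws Exception {
    UserGroupInformation remote =
        UserGroupInformation.createRemoteUser("alice", AuthMethod.TOKEN);
    String name = remote.doAs(new PrivilegedExceptionAction<String>() {
      @Override
      public String run() throws Exception {
        // Inside doAs, getCurrentUser() resolves to the remote user.
        return UserGroupInformation.getCurrentUser().getUserName();
      }
    });
    System.out.println(name);  // prints "alice"
  }
}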
 
Example 17
Project: hadoop-oss   File: AvroSpecificSerialization.java
@InterfaceAudience.Private
@Override
public DatumReader getReader(Class<SpecificRecord> clazz) {
  try {
    return new SpecificDatumReader(clazz.newInstance().getSchema());
  } catch (Exception e) {
    throw new RuntimeException(e);
  }
}
 
Example 18
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * get RealUser (vs. EffectiveUser)
 * @return realUser running over proxy user
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public UserGroupInformation getRealUser() {
  for (RealUser p: subject.getPrincipals(RealUser.class)) {
    return p.getRealUser();
  }
  return null;
}
 
Example 19
Project: hadoop-oss   File: UserGroupInformation.java
/**
 * Create a UGI for testing HDFS and MapReduce
 * @param user the full user principal name
 * @param userGroups the names of the groups that the user belongs to
 * @return a fake user for running unit tests
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public static UserGroupInformation createUserForTesting(String user, 
                                                        String[] userGroups) {
  ensureInitialized();
  UserGroupInformation ugi = createRemoteUser(user);
  // make sure that the testing object is setup
  if (!(groups instanceof TestingGroups)) {
    groups = new TestingGroups(groups);
  }
  // add the user groups
  ((TestingGroups) groups).setUserGroups(ugi.getShortUserName(), userGroups);
  return ugi;
}
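
A sketch of how a unit test might use the fake user; the user and group names are made up:

import org.apache.hadoop.security.UserGroupInformation;

public class TestingUserDemo {
  public static void main(String[] args) throws Exception {
    UserGroupInformation testUser = UserGroupInformation.createUserForTesting(
        "testuser", new String[] { "testgroup", "staff" });
    System.out.println(testUser.getShortUserName());
    // The groups come from the TestingGroups mapping set above, not from the OS or LDAP.
    System.out.println(String.join(",", testUser.getGroupNames()));
  }
}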
 
Example 20
Project: hadoop   File: UserGroupInformation.java
/**
 * Run the given action as the user.
 * @param <T> the return type of the run method
 * @param action the method to execute
 * @return the value from the run method
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public <T> T doAs(PrivilegedAction<T> action) {
  logPrivilegedAction(subject, action);
  return Subject.doAs(subject, action);
}
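
A minimal sketch of the PrivilegedAction overload, using a remote user created only for illustration (no checked exceptions can escape this variant):

import java.security.PrivilegedAction;
import org.apache.hadoop.security.UserGroupInformation;

public class DoAsDemo {
  public static void main(String[] args) {
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("bob");
    String result = ugi.doAs(new PrivilegedAction<String>() {
      @Override
      public String run() {
        // Work placed here runs with "bob" as the Hadoop current user.
        return "ran as bob";
      }
    });
    System.out.println(result);
  }
}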
 
Example 21
Project: hadoop   File: FsServerDefaults.java
@Override
@InterfaceAudience.Private
public void write(DataOutput out) throws IOException {
  out.writeLong(blockSize);
  out.writeInt(bytesPerChecksum);
  out.writeInt(writePacketSize);
  out.writeShort(replication);
  out.writeInt(fileBufferSize);
  WritableUtils.writeEnum(out, checksumType);
}
 
Example 22
Project: hadoop-oss   File: AvroReflectSerialization.java
@InterfaceAudience.Private
@Override
public DatumReader getReader(Class<Object> clazz) {
  try {
    return new ReflectDatumReader(clazz);
  } catch (Exception e) {
    throw new RuntimeException(e);
  }
}
 
Example 23
Project: hadoop   File: CompositeContext.java
@InterfaceAudience.Private
@Override
public void stopMonitoring() {
  for (MetricsContext ctxt : subctxt) {
    ctxt.stopMonitoring();
  }
}
 
Example 24
Project: hadoop   File: DataNode.java
/**
 * Creates a dummy DataNode for testing purposes.
 */
@VisibleForTesting
@InterfaceAudience.LimitedPrivate("HDFS")
DataNode(final Configuration conf) {
  super(conf);
  this.blockScanner = new BlockScanner(this, conf);
  this.fileDescriptorPassingDisabledReason = null;
  this.maxNumberOfBlocksToLog = 0;
  this.confVersion = null;
  this.usersWithLocalPathAccess = null;
  this.connectToDnViaHostname = false;
  this.getHdfsBlockLocationsEnabled = false;
  this.pipelineSupportECN = false;
}
 
Example 25
Project: hadoop-oss   File: CompositeContext.java
@InterfaceAudience.Private
@Override
public void startMonitoring() throws IOException {
  for (MetricsContext ctxt : subctxt) {
    try {
      ctxt.startMonitoring();
    } catch (IOException e) {
      LOG.warn("startMonitoring failed: " + ctxt.getContextName(), e);
    }
  }
}
 
Example 26
Project: hadoop-oss   File: CompositeContext.java
@InterfaceAudience.Private
@Override
public void stopMonitoring() {
  for (MetricsContext ctxt : subctxt) {
    ctxt.stopMonitoring();
  }
}
 
Example 27
Project: hadoop-oss   File: CompositeContext.java
@InterfaceAudience.Private
@Override
public void registerUpdater(Updater updater) {
  for (MetricsContext ctxt : subctxt) {
    ctxt.registerUpdater(updater);
  }
}
 
Example 28
Project: hadoop-oss   File: GangliaContext.java
@Override
@InterfaceAudience.Private
public void init(String contextName, ContextFactory factory) {
  super.init(contextName, factory);
  parseAndSetPeriod(PERIOD_PROPERTY);
      
  metricsServers = 
    Util.parse(getAttribute(SERVERS_PROPERTY), DEFAULT_PORT); 
      
  unitsTable = getAttributeTable(UNITS_PROPERTY);
  slopeTable = getAttributeTable(SLOPE_PROPERTY);
  tmaxTable  = getAttributeTable(TMAX_PROPERTY);
  dmaxTable  = getAttributeTable(DMAX_PROPERTY);
  multicastEnabled = Boolean.parseBoolean(getAttribute(MULTICAST_PROPERTY));
  String multicastTtlValue = getAttribute(MULTICAST_TTL_PROPERTY);
  if (multicastEnabled) {
    if (multicastTtlValue == null) {
      multicastTtl = DEFAULT_MULTICAST_TTL;
    } else {
      multicastTtl = Integer.parseInt(multicastTtlValue);
    }
  }
      
  try {
    if (multicastEnabled) {
      LOG.info("Enabling multicast for Ganglia with TTL " + multicastTtl);
      datagramSocket = new MulticastSocket();
      ((MulticastSocket) datagramSocket).setTimeToLive(multicastTtl);
    } else {
      datagramSocket = new DatagramSocket();
    }
  } catch (IOException e) {
    LOG.error(e);
  }
}
 
Example 29
Project: hadoop   File: CompositeContext.java
@InterfaceAudience.Private
@Override
public void registerUpdater(Updater updater) {
  for (MetricsContext ctxt : subctxt) {
    ctxt.registerUpdater(updater);
  }
}
 
Example 30
Project: hadoop   File: UserGroupInformation.java
@InterfaceAudience.Private
@VisibleForTesting
static void reset() {
  authenticationMethod = null;
  conf = null;
  groups = null;
  setLoginUser(null);
  HadoopKerberosName.setRules(null);
}
 
Example 31
Project: hadoop   File: Find.java
/** Returns the current find options, creating them if necessary. */
@InterfaceAudience.Private
FindOptions getOptions() {
  if (options == null) {
    options = createOptions();
  }
  return options;
}
 
Example 32
Project: hadoop-oss   File: ContentSummary.java
@Override
@InterfaceAudience.Private
public void write(DataOutput out) throws IOException {
  out.writeLong(length);
  out.writeLong(fileCount);
  out.writeLong(directoryCount);
  out.writeLong(getQuota());
  out.writeLong(getSpaceConsumed());
  out.writeLong(getSpaceQuota());
}
 
Example 33
Project: hadoop-oss   File: LocalDirAllocator.java
/**
 * Removes the context from the context config items
 * 
 * @param contextCfgItemName
 */
@Deprecated
@InterfaceAudience.LimitedPrivate({"MapReduce"})
public static void removeContext(String contextCfgItemName) {
  synchronized (contexts) {
    contexts.remove(contextCfgItemName);
  }
}
 
Example 34
Project: hadoop   File: AbstractCounters.java
/** Add a group.
 * @param group object to add
 * @return the group
 */
@InterfaceAudience.Private
public synchronized G addGroup(G group) {
  String name = group.getName();
  if (isFrameworkGroup(name)) {
    fgroups.put(name, group);
  } else {
    limits.checkGroups(groups.size() + 1);
    groups.put(name, group);
  }
  return group;
}
 
Example 35
Project: hadoop   File: UserGroupInformation.java
/**
 * Re-Login a user in from the ticket cache.  This
 * method assumes that login had happened already.
 * The Subject field of this UserGroupInformation object is updated to have
 * the new credentials.
 * @throws IOException on a failure
 */
@InterfaceAudience.Public
@InterfaceStability.Evolving
public synchronized void reloginFromTicketCache()
throws IOException {
  if (!isSecurityEnabled() || 
      user.getAuthenticationMethod() != AuthenticationMethod.KERBEROS ||
      !isKrbTkt)
    return;
  LoginContext login = getLogin();
  if (login == null) {
    throw new IOException("login must be done first");
  }
  long now = Time.now();
  if (!hasSufficientTimeElapsed(now)) {
    return;
  }
  // register most recent relogin attempt
  user.setLastLogin(now);
  try {
    if (LOG.isDebugEnabled()) {
      LOG.debug("Initiating logout for " + getUserName());
    }
    //clear up the kerberos state. But the tokens are not cleared! As per 
    //the Java kerberos login module code, only the kerberos credentials
    //are cleared
    login.logout();
    //login and also update the subject field of this instance to 
    //have the new credentials (pass it to the LoginContext constructor)
    login = 
      newLoginContext(HadoopConfiguration.USER_KERBEROS_CONFIG_NAME, 
          getSubject(), new HadoopConfiguration());
    if (LOG.isDebugEnabled()) {
      LOG.debug("Initiating re-login for " + getUserName());
    }
    login.login();
    setLogin(login);
  } catch (LoginException le) {
    throw new IOException("Login failure for " + getUserName(), le);
  } 
}
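
A sketch of a long-running client refreshing its credentials; how often to call it is up to the caller, and the call is a no-op unless the Kerberos/ticket-cache preconditions above hold:

import org.apache.hadoop.security.UserGroupInformation;

public class ReloginDemo {
  public static void main(String[] args) throws Exception {
    UserGroupInformation ugi = UserGroupInformation.getLoginUser();
    // Call periodically, e.g. from a timer thread; returns quietly when
    // security is off or not enough time has elapsed since the last login.
    ugi.reloginFromTicketCache();
  }
}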
 
Example 36
Project: hadoop   File: ContentSummary.java
@Override
@InterfaceAudience.Private
public void write(DataOutput out) throws IOException {
  out.writeLong(length);
  out.writeLong(fileCount);
  out.writeLong(directoryCount);
  out.writeLong(quota);
  out.writeLong(spaceConsumed);
  out.writeLong(spaceQuota);
}
 
Example 37
Project: hadoop-oss   File: Find.java
/** Returns the current find options, creating them if necessary. */
@InterfaceAudience.Private
FindOptions getOptions() {
  if (options == null) {
    options = createOptions();
  }
  return options;
}
 
Example 38
Project: hadoop-oss   File: FileContext.java
/**
 * Get delegation tokens for the file systems accessed for a given
 * path.
 * @param p Path for which delegations tokens are requested.
 * @param renewer the account name that is allowed to renew the token.
 * @return List of delegation tokens.
 * @throws IOException
 */
@InterfaceAudience.LimitedPrivate( { "HDFS", "MapReduce" })
public List<Token<?>> getDelegationTokens(
    Path p, String renewer) throws IOException {
  Set<AbstractFileSystem> afsSet = resolveAbstractFileSystems(p);
  List<Token<?>> tokenList = 
      new ArrayList<Token<?>>();
  for (AbstractFileSystem afs : afsSet) {
    List<Token<?>> afsTokens = afs.getDelegationTokens(renewer);
    tokenList.addAll(afsTokens);
  }
  return tokenList;
}
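
A hedged sketch of gathering tokens before handing work off to another service; the path and renewer name are placeholders:

import java.util.List;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.token.Token;

public class DelegationTokenDemo {
  public static void main(String[] args) throws Exception {
    FileContext fc = FileContext.getFileContext();
    List<Token<?>> tokens = fc.getDelegationTokens(new Path("/user/alice/input"), "yarn");
    for (Token<?> token : tokens) {
      System.out.println(token.getKind() + " for " + token.getService());
    }
  }
}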
 
Example 39
Project: hadoop-oss   File: SaslRpcClient.java
@VisibleForTesting
@InterfaceAudience.Private
public Object getNegotiatedProperty(String key) {
  return (saslClient != null) ? saslClient.getNegotiatedProperty(key) : null;
}
 
Example 40
Project: hadoop-oss   File: SaslRpcClient.java
@InterfaceAudience.Private
public AuthMethod getAuthMethod() {
  return authMethod;
}