Date: Fri, 10 Nov 2017 14:28:00 +0000 (UTC)
From: "Andras Bokor (JIRA)"
To: common-dev@hadoop.apache.org
Subject: [jira] [Resolved] (HADOOP-9324) Out of date API document

     [ https://issues.apache.org/jira/browse/HADOOP-9324?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andras Bokor resolved HADOOP-9324.
----------------------------------
    Resolution: Duplicate

I have raised HADOOP-15021, which is the root cause of most of the issues above. The others are OK.

1. Covered by HADOOP-15021
2. Covered by HADOOP-15021
3. Covered by HADOOP-15021
4. JoinCollector is not deleted
5. No longer an issue
6. Covered by HADOOP-15021
7. Covered by HADOOP-15021
8. Covered by HADOOP-15021
9. Covered by HADOOP-15021
10. JobContextImpl is not deleted. It will be covered by HADOOP-15021
11. It is correct as it is
12. Covered by HADOOP-15021
13. Covered by HADOOP-15021
14. Covered by HADOOP-15021
15. Covered by HADOOP-15021
16. Package exists
17. Covered by HADOOP-15021
18. Covered by HADOOP-15021
19. No longer valid
20. Covered by HADOOP-15021

> Out of date API document
> ------------------------
>
>                 Key: HADOOP-9324
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9324
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: documentation
>    Affects Versions: 2.0.3-alpha
>            Reporter: Hao Zhong
>
> The documentation is out of date. Some code references are broken:
> 1. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/fs/FSDataInputStream.html
> "All Implemented Interfaces:
> Closeable, DataInput, *org.apache.hadoop.fs.ByteBufferReadable*, *org.apache.hadoop.fs.HasFileDescriptor*, PositionedReadable, Seekable"
> 2. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/Cluster.html
> "renewDelegationToken(*org.apache.hadoop.security.token.Token* token)
> Deprecated. Use Token.renew(*org.apache.hadoop.conf.Configuration*) instead"
> 3. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/JobConf.html
> "Use MRAsyncDiskService.moveAndDeleteAllVolumes instead."
> I cannot find the MRAsyncDiskService class in the documentation of 2.0.3.
> 4. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/join/CompositeRecordReader.html
> "protected *org.apache.hadoop.mapred.join.CompositeRecordReader.JoinCollector* jc"
> Please globally search for JoinCollector. It is deleted, but mentioned many times in the current documentation.
> 5. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/OutputCommitter.html
> "abortJob(JobContext context, *org.apache.hadoop.mapreduce.JobStatus.State runState*)"
> http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/Job.html
> "public *org.apache.hadoop.mapreduce.JobStatus.State* getJobState()"
> 4. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/SequenceFileOutputFormat.html
> "static *org.apache.hadoop.io.SequenceFile.CompressionType* getOutputCompressionType"
> "static *org.apache.hadoop.io.SequenceFile.Reader[]* getReaders"
> 5. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/TaskCompletionEvent.html
> "Returns enum Status.SUCESS or Status.FAILURE." -> Status.SUCCEEDED?
> 6. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/Job.html
> "static *org.apache.hadoop.mapreduce.Job.TaskStatusFilter* getTaskOutputFilter"
> "org.apache.hadoop.mapreduce.TaskReport[] getTaskReports(TaskType type)"
> 7. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/Reducer.html
> "cleanup(*org.apache.hadoop.mapreduce.Reducer.Context* context)"
> 8. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/SequenceFileOutputFormat.html
> "static *org.apache.hadoop.io.SequenceFile.CompressionType* getOutputCompressionType(JobConf conf)
> Get the *SequenceFile.CompressionType* for the output SequenceFile."
> "static *org.apache.hadoop.io.SequenceFile.Reader[]* getReaders"
> 9. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/lib/partition/InputSampler.html
> "writePartitionFile(Job job, *org.apache.hadoop.mapreduce.lib.partition.InputSampler.Sampler* sampler)"
> 10. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/lib/partition/TotalOrderPartitioner.html
> "contain JobContextImpl.getNumReduceTasks() - 1 keys."
> The JobContextImpl class is already deleted.
> 11.
> http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/OutputCommitter.html
> "Note that this is invoked for jobs with final runstate as JobStatus.State.FAILED or JobStatus.State.KILLED." -> JobStatus.FAILED, JobStatus.KILLED?
> 12. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/TaskAttemptContext.html
> "All Superinterfaces:
> JobContext, *org.apache.hadoop.mapreduce.MRJobConfig*, Progressable, TaskAttemptContext"
> 13. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/metrics/file/FileContext.html
> "All Implemented Interfaces:
> *org.apache.hadoop.metrics.MetricsContext*"
> 14. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/metrics/spi/AbstractMetricsContext.html
> "*org.apache.hadoop.metrics.MetricsRecord* createRecord(String recordName)"
> 15. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/net/DNSToSwitchMapping.html
> "If a name cannot be resolved to a rack, the implementation should return NetworkTopology.DEFAULT_RACK."
> NetworkTopology is deleted.
> 16. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/metrics2/package-summary.html
> "myprefix.sink.file.class=org.hadoop.metrics2.sink.FileSink" -> org.apache.hadoop.metrics2.sink.FileSink?
> "org.apache.hadoop.metrics2.impl" -> The package is not found.
> 17. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/ha/HAServiceTarget.html
> "abstract *org.apache.hadoop.ha.NodeFencer* getFencer()"
> 18. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/MarkableIterator.html
> "MarkableIterator is a wrapper iterator class that implements the MarkableIteratorInterface."
> MarkableIteratorInterface is deleted.
> 19. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/metrics/spi/NoEmitMetricsContext.html
> "A MetricsContext that does not emit data, but, unlike NullContextWithUpdate"
> NullContextWithUpdate is deleted.
> 20. http://hadoop.apache.org/docs/current/api/org/apache/hadoop/net/ConnectTimeoutException.html
> "Thrown by NetUtils.connect(java.net.Socket, java.net.SocketAddress, int)"
> The NetUtils class is deleted.
> Please revise the documentation.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-dev-help@hadoop.apache.org