= Apache Hadoop 2.X =
This feature has been moved to https://fedoraproject.org/wiki/Changes/Hadoop. History is preserved here for posterity.


== Summary ==
Bring Apache Hadoop, the hottest open source big data platform, to Fedora, the hottest open source distribution. Fedora should be the best distribution for using Apache Hadoop.
 
This and other big data activities are coordinated in the [https://fedoraproject.org/wiki/Big_data_SIG Big Data SIG].
 
== Owner ==
* Name: [[User:matt | Matthew Farrellee]]
* Email: matt@fedoraproject.org
 
=== People involved ===
{|
! Name !! IRC !! Focus !! Additional
|-
| [[User:matt | Matthew Farrellee]]
| mattf
| keeping track, integration testing
| UTC-5
|-
| [[User:pmackinn | Peter MacKinnon]]
| pmackinn
| packaging
| UTC-5
|-
| [[User:rrati | Rob Rati]]
| rsquared
| packaging
| UTC-5
|-
| [[User:tstclair | Timothy St. Clair]]
| tstclair
| config, upstream tracking
| UTC-6
|-
| [[User:skottler | Sam Kottler]]
| skottler
| packaging
| UTC-5
|-
| [[User:gil | Gil Cattaneo]]
| gil
| packaging
| UTC+1
|-
| [[User:cicku | Christopher Meng]]
| cicku
| packaging, testing
| UTC+8
|}
 
== Current status ==
* Targeted release: [[Releases/20 | Fedora 20 ]]
* Last updated: 11 June 2013
* Percentage of completion
** Dependencies available in Fedora (missing since project initiation): 80%
** Adaptation of Hadoop 2.0.2a source via patches: 100%
** Hadoop spec completion: 80%
* Test suite
** hadoop-common Tests run: 1719, Failures: 12, Errors: 4, Skipped: 16
** hadoop-hdfs Tests run: 1443, Failures: 5, Errors: 1, Skipped: 3
** hadoop-yarn Tests run: 126, Failures: 0, Errors: 0, Skipped: 0
** hadoop-mapreduce Tests run: 59, Failures: 0, Errors: 0, Skipped: 0
** Disabled: 40 classes (see log below)
 
== Detailed Description ==
Apache Hadoop is a widely used, increasingly complete big data platform, with a strong open source community and growing ecosystem. The goal is to package and integrate the core of the Hadoop ecosystem for Fedora, allowing for immediate use and creating a base for the rest of the ecosystem.
 
 
== Benefit to Fedora ==
The Apache Hadoop software will be packaged and integrated with Fedora. The core of the Hadoop ecosystem will be available with Fedora and provide a base for additional packages.
 
 
== Scope ==
* Package the Apache Hadoop 2.0.2 software
* Package all dependencies needed for Apache Hadoop 2.0.2
* Skip package dependencies required for unit testing, record them in a dependency backlog for later cleanup
 
=== Approach ===
We are taking an iterative, depth-first approach to packaging. We do not have all the dependencies mapped out ahead of time. Dependencies are being tabulated into two groups:
# ''missing'' - the dependency requested by a hadoop-common pom has not yet been packaged, reviewed, or published in the Fedora repos
# ''broken'' - the dependency requested is out of date relative to the current Fedora version; patches that address any build, API, or source-code deltas must be developed for inclusion in the hadoop RPM build
Note that a dependency may show up in both of these tables.
 
Anyone who wants to help should find an available dependency below and edit the table, changing its state to Active and the packager to yourself.
 
While packaging a dependency, test dependencies can be skipped. Testing will be done via integration testing periodically during packaging and then after packaging completes. Test dependencies that are skipped must be added to the [[#skip|Skipped dependencies]] table below.
 
If you are ''lucky enough'' to pick a dependency that itself has unpackaged dependencies, identify the sub-dependencies and add them to the bottom of the [[#deps|Dependencies]] table below, change your current dependency to Blocked and repeat.
 
If your dependency is already packaged but the version is incompatible, contact the package owner and resolve the incompatibility in a mutually satisfactory way. For instance:
 
* If the version available in Fedora is older, explore updating the package. If that is not possible, explore creating a package that includes a version in its name, e.g. pkgnameXY. Ultimately, the most recent version in Fedora should have the name pkgname while older versions have pkgnameXY. It may take a full Fedora release to rationalize package names. Make a note in the [[#deps|Dependencies]] table.
 
* If the version you need is older than the packaged version, consider creating a patch to use the newer version. If a patch is not viable, proceed by packaging the dependency with a version in its name, e.g. pkgnameXY. Make a note in the [[#deps|Dependencies]] table.
 
There is [http://www.jboss.org/tattletale tattletale] dependency graph data for both the [http://pmackinn.fedorapeople.org/tattletale/branch-2.0.2-alpha/ baseline branch] and the [http://pmackinn.fedorapeople.org/tattletale/fedora-2.0.2-alpha-integration/ fedora development branch].
 
=== Dependencies ===
{| class="wikitable"
|+ Missing dependency legend
! State !! Notes
|-
| '''<span style="color:darkviolet">Available</span>''' || free for someone to take
|-
| '''<span style="color:blue">Active</span>'''    || the dependency is actively being packaged if missing, or a patch is being developed or tested for inclusion in the hadoop-common build
|-
| '''<span style="color:red">Blocked</span>'''  || pending packages for dependencies
|-
| '''<span style="color:orange">Review</span>'''    || under review, include link to review BZ
|-
| '''<span style="color:green">Complete</span>'''  || woohoo!
|}
 
{| class="wikitable"
|+ <div id="deps">Missing Dependencies</div>
! Project !! State !! Review BZ !! Packager !! Notes
|-
| hadoop
| '''<span style="color:blue">Active</span>'''
|
| [[User:rrati|rrati]],[[User:pmackinn|pmackinn]]
|
|-
| bookkeeper
| '''<span style="color:green">Complete</span>'''
| {{bz|948589}}
| [[User:gil|gil]]
| Version 4.0 requested. packaged 4.2.1. Patch: [https://issues.apache.org/jira/browse/BOOKKEEPER-598 BOOKKEEPER-598]
|-
| glassfish-gmbal
| '''<span style="color:green">Complete</span>'''
| {{bz|859112}}
| [[User:gil|gil]]
| [https://koji.fedoraproject.org/koji/buildinfo?buildID=413470 F18 build]
|-
| glassfish-management-api
| '''<span style="color:green">Complete</span>'''
| {{bz|859110}}
| [[User:gil|gil]]
| [https://koji.fedoraproject.org/koji/buildinfo?buildID=412579 F18 build]
|-
| grizzly
| '''<span style="color:green">Complete</span>'''
| {{bz|859114}}
| [[User:gil|gil]]
| Only for F20 for now. Cause: missing glassfish-servlet-api on [https://bugzilla.redhat.com/show_bug.cgi?id=959702 F18 and F19].
|-
| groovy
| '''<span style="color:green">Complete</span>'''
| {{bz|858127}}
| [[User:gil|gil]]
| 1.5 requested but 1.8 packaged in Fedora. Going forward, the 1.8 series will likely be known as groovy18 while groovy tracks 2.x.
|-
| jersey
| '''<span style="color:green">Complete</span>'''
| {{bz|825347}}
| [[User:gil|gil]]
| [https://koji.fedoraproject.org/koji/buildinfo?buildID=407315 F18 build] Should be rebuilt with grizzly2 support enabled.
|-
| jets3t
| '''<span style="color:green">Complete</span>'''
| {{bz|847109}}
| [[User:gil|gil]]
|
|-
| jspc-compiler
| '''<span style="color:green">Complete</span>'''
|{{bz|960720}}
|[[User:pmackinn|pmackinn]]
|Passes preliminary overall hadoop-common compilation/testing.
|-
| <strike> kfs </strike>
| <strike>'''<span style="color:orange">Review</span>'''</strike>
| <strike>{{bz|960728}}</strike>
| <strike>[[User:pmackinn|pmackinn]]</strike>
| <strike>kfs has become Quantcast qfs.</strike> '''No longer a dependency in 2.0.5-beta & > '''
|-
| maven-native
| '''<span style="color:green">Complete</span>'''
| {{bz|864084}}
| [[User:gil|gil]]
| Needs patch to build with java7. NOTE: javac target/source is already set by mojo.java.target option
|-
| zookeeper
| '''<span style="color:green">Complete</span>'''
| {{bz|823122}}
| [[User:gil|gil]]
| requires [https://koji.fedoraproject.org/koji/buildinfo?buildID=957337 jtoaster]
|}
{| class="wikitable"
|+ <div id="broken-deps">Broken Dependencies</div>
! Project !! Packager !! Notes
|-
| ant
|
| Version 1.6 requested, 1.8 currently packaged in Fedora. Needs to be inspected for API/functional incompatibilities.
|-
| apache-commons-collections
|[[User:pmackinn|pmackinn]]
| Java import compilation error with existing package.  Patches for hadoop-common being tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-collections
|-
| apache-commons-math
|[[User:pmackinn|pmackinn]]
| Current apache-commons-math uses math3 in pom instead of math, and API changes in code. Patches for hadoop-common being tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-math
|-
| cglib
|[[User:pmackinn|pmackinn]]
| Missing an explicit dependency that the old dependency chain didn't need. Patches for hadoop-common being tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-cglib
|-
| ecj
| [[User:rrati|rrati]]
| Need ecj version ecj-4.2.1-6 or later to resolve a dependency lookup issue
|-
| gmaven
| [[User:gil|gil]]
| Version 1.0 requested, available 1.4 (but has broken deps) {{bz|914056}}
|-
| hadoop-hdfs
| [[User:pmackinn|pmackinn]]
| glibc link error in hdfs native build. Patch for hadoop-common being tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-cmake-hdfs
|-
| hsqldb
| [[User:tradej|tradej]]
| 1.8 in Fedora; an update to 2.2.9 is in progress. API compatibility to be checked.
|-
| jersey
| [[User:pmackinn|pmackinn]]
| Needs jersey-servlet and version. Tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-jersey
|-
| jets3t
| [[User:pmackinn|pmackinn]]
| Requires 0.6.1. With 0.9.x, hadoop-common fails in Jets3tNativeFileSystemStore.java with an "incompatible types" error at <code>S3ObjectsChunk chunk = s3Service.listObjectsChunked(bucket.getName()...</code>. Patches for hadoop-common being tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-jets3t
|-
| jetty
| [[User:rrati|rrati]]
| jetty8 packaged in Fedora, but 6.x requested. 6 and 8 are incompatible. Patches tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-jetty
|-
| slf4j
|[[User:pmackinn|pmackinn]]
| The package in Fedora fails to match during dependency resolution. The jcl104-over-slf4j dep in hadoop-common moved to jcl-over-slf4j as part of the jspc/tomcat dep. Patch being tracked at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-jasper
|-
| tomcat-jasper
| [[User:pmackinn|pmackinn]]
| Version 5.5.x requested. Adaptations made for incumbent Tomcat 7 via patches at https://github.com/fedora-bigdata/hadoop-common/tree/fedora-patch-jasper. Reviewing fit as part of overall hadoop-common compilation/testing.
|}
 
=== Test Suite ===
{| class="wikitable"
|+ <div id="junit">Unit Test Log (for 2.0.2-alpha)</div>
! Module !! Name !! Baseline !! Fedora !! Tester !! Notes
|-
| hadoop-common
| TestDoAsEffectiveUser
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| Fails in hadoop-common test suite but frequently succeeds as standalone. testRealUserSetup
|-
| hadoop-common
| TestRPCCompatibility
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| Fails in hadoop-common test suite but frequently succeeds as standalone. testVersion2ClientVersion1Server: "expected:<3> but was:<-3>"
|-
| hadoop-common
| TestSSLHttpServer
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:yellow">Fixed</span>'''
| [[User:pmackinn|pmackinn]]
| Required addition of SslContextFactory (new to Jetty 8+) setup in advance of SslConnector activation. Tracked [https://github.com/fedora-bigdata/hadoop-common/commit/2400ccf2885e581a435509c1bc6fd79023deb41b here].
|-
| hadoop-hdfs
| TestWebHdfsFileSystemContract
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| OOM, possibly inside jetty container, leading to apparent hang. Unclear if related to datastream IO failures or vice-versa.
|-
| hadoop-hdfs
| TestDelegationTokenForProxyUser
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testWebHdfsDoAs: "expected:<200> but was:<401>". BasicAuth config?
|-
| hadoop-hdfs
| TestEditLogRace
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testSaveRightBeforeSync
|-
| hadoop-hdfs
| TestHftpURLTimeouts
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testHsftpSocketTimeout -> intermittent test failure
|-
| hadoop-hdfs
| TestNameNodeMetrics
| '''<span style="color:red">Fail</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testCorruptBlock: Bad value for metric PendingReplicationBlocks expected:<0> but was:<1> in test suite. Sporadic.
|-
| hadoop-hdfs-httpfs
| TestCheckpoint
| '''<span style="color:red">Fail</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testSecondaryHasVeryOutOfDateImage: Test resulted in an unexpected exit in test suite. Standalone OK.
|-
| hadoop-hdfs/src/contrib/bkjournal
| TestBookKeeperJournalManager
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testOneBookieFailure, testAllBookieFailure: bookies not starting?
|-
| hadoop-hdfs/src/contrib/bkjournal
| TestBookKeeperAsHASharedDir
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| more bookies not starting?
|-
| hadoop-hdfs/src/contrib/bkjournal
| TestBookKeeperHACheckpoints
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testCheckpointWhenNoNewTransactionsHappened: Port in use: localhost:10001 (but not really)
|-
| hadoop-yarn-server-nodemanager
| TestNMWebServices*
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| All tests in error with java.lang.InstantiationException: something with the adaptation of GuiceServlet
|-
| hadoop-yarn-server-resourcemanager
| TestRMWebServices*
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| All tests in error with java.lang.InstantiationException: same as above
|-
| hadoop-yarn-server-resourcemanager
| TestDelegationTokenRenewer
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testDTRenewalWithNoCancel: renew wasn't called as many times as expected expected:<1> but was:<2>
|-
| hadoop-yarn-server-resourcemanager
| TestApplicationTokens
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testTokenExpiry: sporadic NPE
|-
| hadoop-yarn-server-resourcemanager
| TestAppManager
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testRMAppSubmit,testRMAppSubmitWithQueueAndName: app event type is wrong before expected:<KILL> but was:<APP_REJECTED>; sporadic
|-
| hadoop-yarn-client
| TestYarnClient
| '''<span style="color:red">Fail</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testClientStop: Can only configure with YarnConfiguration
|-
| hadoop-yarn-applications
| TestUnmanagedAMLauncher
| '''<span style="color:red">Fail</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| Seems designed to execute once by successfully contacting an RM but repeatedly retries with: yarnAppState=FAILED, distributedFinalState=FAILED
|-
| hadoop-mapreduce-client-app
| TestAMWebServices*
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| All tests in error with java.lang.InstantiationException: same guice-servlet problem
|-
| hadoop-mapreduce-client-hs
| TestHsWebServices*
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| All tests in error with java.lang.InstantiationException: same guice-servlet problem
|-
| hadoop-mapreduce-client-jobclient
| TestMiniMRProxyUser
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testValidProxyUser: assert fail
|-
| hadoop-mapreduce-client-jobclient
| TestMRJobs
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testDistributedCache: assert fail
|-
| hadoop-mapreduce-client-jobclient
| TestMapReduceLazyOutput
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testLazyOutput: assert fail
|-
| hadoop-mapreduce-client-jobclient
| TestEncryptedShuffle
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| encryptedShuffleWithClientCerts,encryptedShuffleWithoutClientCerts: assert fail
|-
| hadoop-mapreduce-client-jobclient
| TestJobName
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testComplexName,testComplexNameWithRegex: Job failed!
|-
| hadoop-mapreduce-client-jobclient
| TestJobSysDirWithDFS
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testWithDFS: Job failed!
|-
| hadoop-mapreduce-client-jobclient
| TestClusterMapReduceTestCase
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testMapReduce,testMapReduceRestarting: Job failed!
|-
| hadoop-mapreduce-client-jobclient
| TestLazyOutput
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testLazyOutput: Job failed! 
|-
| hadoop-mapreduce-client-jobclient
| TestMiniMRWithDFSWithDistinctUsers
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testDistinctUsers,testMultipleSpills: Job failed!
|-
| hadoop-mapreduce-client-jobclient
| TestRMNMInfo
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testRMNMInfoMissmatch:
|-
| hadoop-distcp
| TestCopyCommitter
| '''<span style="color:green">Pass</span>'''
| '''<span style="color:red">Fail</span>'''
| [[User:pmackinn|pmackinn]]
| testNoCommitAction: Commit failed
|}
''Tests are listed in the order of execution''<br />
Baseline: F18, maven 3.0.5, Oracle JDK 1.6u45
 
=== Upstream Patch Tracking ===
Currently tracking against branch-2 @ https://github.com/timothysc/hadoop-common
 
{| class="wikitable"
|+ <div id="modtrack">Modification Tracking</div>
! Branch !! Committer !! JIRA !! Target !! Status
|-
| fedora-patch-math
| pmackinn
| https://issues.apache.org/jira/browse/HADOOP-9594
| 2.0.5-beta
| '''<span style="color:red">PENDING REVIEW</span>'''
|-
| <strike>fedora-patch-junit</strike>
| pmackinn
| https://issues.apache.org/jira/browse/HADOOP-9605
| 2.0.5-beta
| '''<span style="color:green">COMMITTED</span>'''
|-
| <strike>fedora-patch-javadocs</strike>
| rsquared
| https://issues.apache.org/jira/browse/HADOOP-9607
| 2.0.5-beta
| '''<span style="color:green">COMMITTED</span>'''
|-
| fedora-patch-collections
| pmackinn
| https://issues.apache.org/jira/browse/HADOOP-9610
| 2.0.5-beta
| '''<span style="color:red">PENDING REVIEW</span>'''
|-
| fedora-patch-cglib
| pmackinn
| https://issues.apache.org/jira/browse/HADOOP-9611
| 2.0.5-beta
| '''<span style="color:red">PENDING REVIEW</span>'''
|-
| fedora-patch-jersey
| pmackinn
| https://issues.apache.org/jira/browse/HADOOP-9613
| 2.0.5-beta
| '''<span style="color:red">PENDING REVIEW</span>'''
|-
| fedora-patch-jets3t
| pmackinn
| https://issues.apache.org/jira/browse/HADOOP-9623
| 2.0.5-beta
| '''<span style="color:red">PENDING REVIEW</span>'''
|-
| <strike>fedora-patch-cmake-hdfs</strike>
| <strike>pmackinn</strike>
| <strike>N/A</strike>
| 2.0.5-beta
| '''<span style="color:green">Already Modified Upstream</span>'''
|}
 
=== RPM spec ===
==== TODO ====
* xmvn conversion
* rpmlint cleanup
* secure datanode configuration/startup verification
 
==== Tests run ====
The following tests were run and passed against a Hadoop setup built from the RPMs:
* pi
* randomwriter
* teragen
* terasort
* teravalidate
 
== Packager Resources ==
=== Packager tips ===
* The mvn-rpmbuild utility will ONLY resolve artifacts from the system (JPP) repo
* mvn-local will resolve from the system repo first, then fall back to Maven if unresolved
** can be used to find the delta between available system repo packages and missing dependencies, which land in the local Maven repo (<code>find ~/.m2/repository -name '*.jar'</code>)
* -Dmaven.local.debug=true
** reveals how JPP lookups are executed per dependency: useful for finding groupId/artifactId mismatches
* -Dmaven.test.skip=true
** tells Maven to skip test runs AND test compilation
** useful for unblocking an end-to-end build
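The mvn-local tip above can be sketched concretely: after an mvn-local build, any jar that fell through to the local Maven repo is a dependency the system repo could not satisfy. A minimal illustration, using a scratch directory standing in for <code>~/.m2/repository</code> (the groovy path is just a hypothetical example):

```shell
# Scratch directory standing in for ~/.m2/repository after an mvn-local run.
repo=$(mktemp -d)
mkdir -p "$repo/org/codehaus/groovy/groovy/1.8.6"
touch "$repo/org/codehaus/groovy/groovy/1.8.6/groovy-1.8.6.jar"

# Each jar listed is an artifact the system (JPP) repo could NOT resolve,
# i.e. a candidate for the Missing Dependencies table above.
find "$repo" -name '*.jar' | sed "s|^$repo/||" | sort
```

The group/artifact/version layout of each listed path tells you exactly which Maven coordinates still need a Fedora package.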
 
'''An alternative to gmaven:'''
* apply a patch with the following content where required
* test support is not guaranteed and may not work
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <version>1.7</version>
        <dependencies>
          <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy</artifactId>
            <version>any</version>
          </dependency>
          <dependency>
            <groupId>antlr</groupId>
            <artifactId>antlr</artifactId>
            <version>any</version>
          </dependency>
          <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>any</version>
          </dependency>
          <dependency>
            <groupId>asm</groupId>
            <artifactId>asm-all</artifactId>
            <version>any</version>
          </dependency>
          <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-nop</artifactId>
            <version>any</version>
          </dependency>
        </dependencies>
        <executions>
          <execution>
            <id>compile</id>
            <phase>process-sources</phase>
            <configuration>
              <target>
                <mkdir dir="${basedir}/target/classes"/>
                <taskdef name="groovyc" classname="org.codehaus.groovy.ant.Groovyc">
                  <classpath refid="maven.plugin.classpath"/>
                </taskdef>
                <groovyc destdir="${project.build.outputDirectory}" srcdir="${basedir}/src/main" classpathref="maven.compile.classpath">
                  <javac source="1.5" target="1.5" debug="on"/>
                </groovyc>
              </target>
            </configuration>
            <goals>
              <goal>run</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
 
=== Repositories ===
An RPM repository of dependencies already packaged and in, or heading towards, review state can be found here:
 
http://repos.fedorapeople.org/repos/rrati/hadoop/
 
Currently, only Fedora 18 x86_64 packages are available.
 
 
Source repositories:
* https://github.com/fedora-bigdata/hadoop-common (fork of Apache Hadoop carrying the changes required to support compilation on Fedora)
* https://github.com/fedora-bigdata/hadoop-rpm (spec and supporting files for generating an RPM for Fedora)
 
 
=== Workflow ===
The Apache Hadoop project uses a number of old or obsolete dependencies in its build and test environment, which presents a challenge for including Apache Hadoop in Fedora. Any change to the Apache Hadoop source or build files that is required in order to use a newer version of a dependency is a candidate for a patch to send upstream. Any changes required to conform to Fedora's packaging guidelines, or to deal with a package naming issue, should be contained to the hadoop spec file.
 
The intention of this process is to isolate changes to a single dependency so that patches can be created for upstream consumption. It is '''important''' that source changes be isolated to one dependency and be self-contained. A dependency is not necessarily a single jar file. Changes for a dependency should entail everything needed to use the jar files from a later release of that dependency.
 
 
==== Dependency Branches ====
All code/build changes to Apache Hadoop should be performed on a branch in the hadoop-common repo that should be based off the
 
:'''branch-2.0.2-alpha'''
 
branch and should follow this naming convention:
 
:'''fedora-patch-<dependency>'''
 
Where <dependency> is the name of the dependency being worked on. Changes on this branch should ONLY relate to that dependency. Do not include the dependency version in the branch name. These branches will be updated as needed, in response to Fedora or Hadoop updates, until they are accepted upstream by Apache Hadoop. Omitting the version allows the branch to move from version 1->2->3 without confusion if that is required before the changes are accepted upstream.
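As a concrete sketch of the convention (a throwaway repo is used here so the commands are self-contained; in practice you would work inside a clone of fedora-bigdata/hadoop-common, and jetty is just an example dependency):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.email=packager@example.org -c user.name=packager \
    commit -q --allow-empty -m "base"
git branch branch-2.0.2-alpha            # stand-in for the upstream base branch

# One branch per dependency; note: no version number in the branch name.
dep=jetty
git checkout -q -b "fedora-patch-${dep}" branch-2.0.2-alpha
git rev-parse --abbrev-ref HEAD          # -> fedora-patch-jetty
```

All commits adapting hadoop-common to the Fedora-packaged version of that one dependency then land on this branch and nowhere else.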
 
==== Integration Branch ====
An integration branch should be created in the hadoop-common repository that corresponds with the release version being packaged using the following naming convention:
 
:'''fedora-<version>-integration'''
 
where <version> is the hadoop version being packaged. All branches containing changes that have not yet been accepted upstream should be merged to the integration branch, and the result should pass the build and all tests. Once this is complete, a patch should be generated and pushed to the hadoop-rpm repository.
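A sketch of the merge step, again in a throwaway repo so it is self-contained (the two fedora-patch-* branches are hypothetical stand-ins for real dependency branches; in the real workflow the merged tree must then build and pass tests before a patch is generated for hadoop-rpm):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
g() { git -c user.email=packager@example.org -c user.name=packager "$@"; }
g commit -q --allow-empty -m "base"

# Pending dependency branches, each carrying its own isolated changes:
git branch fedora-patch-jetty
git branch fedora-patch-jersey

# Integration branch named after the hadoop version being packaged:
git checkout -q -b fedora-2.0.2-alpha-integration
g merge -q fedora-patch-jetty fedora-patch-jersey
git rev-parse --abbrev-ref HEAD
```

Because each dependency branch is self-contained, a branch that gets accepted upstream can simply be dropped from the next integration merge.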
 
==== Testing Changes ====
In order for a set of changes to be considered complete, it must compile and pass all tests in two separate ways:
 
# On Fedora using Fedora packages (mvn-rpmbuild)
# On Fedora using maven retrieved packages (mvn)
 
The changes should compile and the build process should run through all tests without error.  To verify a set of changes, use the following options:
 
:'''<mvn-build> -Pnative install'''
 
Where <mvn-build> is either mvn-rpmbuild or mvn.
 
NOTE: ''This pirates' code is more what you'd call guidelines than actual rules. There are places where incompatible changes exist (at least for now): for example, the zookeeper test jar.''
 
== How To Test ==
<!-- This does not need to be a full-fledged document.  Describe the dimensions of tests that this feature is expected to pass when it is done.  If it needs to be tested with different hardware or software configurations, indicate them.  The more specific you can be, the better the community testing can be.
 
Remember that you are writing this how to for interested testers to use to check out your feature - documenting what you do for testing is OK, but it's much better to document what *I* can do to test your feature.
 
A good "how to test" should answer these four questions:
 
0. What special hardware / data / etc. is needed (if any)?
1. How do I prepare my system to test this feature? What packages
need to be installed, config files edited, etc.?
2. What specific actions do I perform to check that the feature is
working like it's supposed to?
3. What are the expected results of those actions?
-->
 
=== Source test suite ===
In order to ''attempt'' to run any part of the test suite, you must first build the components (F18 for now):
<code>
git clone git://github.com/fedora-bigdata/hadoop-common.git
cd hadoop-common
git checkout -b fedora-2.0.2-alpha-test origin/fedora-2.0.2-alpha-test
mvn-rpmbuild -Pdist,native -DskipTest -DskipTests -DskipIT install
</code>
If you are interested in the whole ball of wax, run
<code>
mvn-rpmbuild -X -Dorg.apache.jasper.compiler.disablejsr199=true{{ref|jsr109}} test
</code>
and go mow a football field or knit a sweater. Note that this could still result in spurious failures. Add <code>-Dmaven.test.failure.ignore=true</code> to the above line if you're seeking just test errors.
 
The [https://github.com/fedora-bigdata/hadoop-common/tree/fedora-2.0.2-alpha-test fedora-2.0.2-alpha-test branch] excludes identified consistently failing tests. You can edit your copy of [https://github.com/fedora-bigdata/hadoop-common/blob/fedora-2.0.2-alpha-test/hadoop-project/pom.xml#L909 hadoop-project/pom.xml] to bring any of them back into play.
 
If you are interested in investigating specific failures such as [[#Test_Suite|active ones from the table above]] then target the module, test class, and even method as you see fit:
<code>
mvn-rpmbuild -X -pl :hadoop-common test -Dtest=TestSSLHttpServer#testEcho
</code>
 
All your hard work results in a patch? Great! Hit a [[#People_involved|contributor]] up with it and we'll review and apply if everything looks cool.
 
{{note|jsr109}} This option is required to ensure TestHttpServer#testContentTypes passes, due to the use of Glassfish JSP support.
 
=== Holistic ===
# '''TODO: NEEDS MORE DEFINITION'''
# yum install X Y Z across one or more nodes
# Setup a simple cluster by following '''TBD'''
# Run http://hadoop.apache.org/docs/stable/gridmix.html
 
== User Experience ==
Users interested in running Apache Hadoop on Fedora will find it available in the Fedora Project yum repositories.
 
'''TODO: SPECIFICALLY PACKAGES X Y Z'''
 
 
== Dependencies ==
No other packages currently depend on Apache Hadoop.
 
Completion of this feature will involve packaging numerous dependencies, see [[#deps|the Dependencies table]]. Some of the dependencies are already being packaged by others in the Fedora community. Where dependency overlap is found, a negotiation must occur to ensure a satisfactory version and package is available to all parties.
 
'''TODO: Is https://fedoraproject.org/wiki/Hypertable ?'''
 
 
== Contingency Plan ==
With no packages depending on Apache Hadoop, none is necessary. The biggest risk is not completing packages for all dependencies. In that case, the feature can be removed from the release notes. The packaged dependencies should remain in the distribution. The feature can be pushed to the next Fedora release.
 
 
== Documentation ==
* http://wiki.apache.org/hadoop
* http://sochotni.fedorapeople.org/java-packaging-howto/
* http://mizdebsk.fedorapeople.org/xmvn/site/
 
== Release Notes ==
<!-- The Fedora Release Notes inform end-users about what is new in the release.  Examples of past release notes are here: http://docs.fedoraproject.org/release-notes/ -->
<!-- The release notes also help users know how to deal with platform changes such as ABIs/APIs, configuration or data file formats, or upgrade concerns.  If there are any such changes involved in this feature, indicate them here.  You can also link to upstream documentation if it satisfies this need.  This information forms the basis of the release notes edited by the documentation team and shipped with the release. -->
* '''TODO'''
 
 
== Comments and Discussion ==
* See [[Talk:Features/Hadoop]]
 
 
[[Category:FeaturePageIncomplete]]
<!-- When your feature page is completed and ready for review -->
<!-- remove Category:FeaturePageIncomplete and change it to Category:FeatureReadyForWrangler -->
<!-- After review, the feature wrangler will move your page to Category:FeatureReadyForFesco... if it still needs more work it will move back to Category:FeaturePageIncomplete-->
<!-- A pretty picture of the page category usage is at: https://fedoraproject.org/wiki/Features/Policy/Process -->

Latest revision as of 21:17, 8 July 2013
