lucene-commits mailing list archives

From rm...@apache.org
Subject svn commit: r1160700 [17/22] - in /lucene/dev/branches/flexscoring: ./ dev-tools/eclipse/ dev-tools/idea/.idea/ dev-tools/idea/lucene/contrib/ dev-tools/idea/lucene/contrib/demo/ dev-tools/idea/lucene/contrib/highlighter/ dev-tools/idea/lucene/contrib/...
Date Tue, 23 Aug 2011 14:07:19 GMT
Modified: lucene/dev/branches/flexscoring/solr/CHANGES.txt
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/CHANGES.txt?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/CHANGES.txt (original)
+++ lucene/dev/branches/flexscoring/solr/CHANGES.txt Tue Aug 23 14:06:58 2011
@@ -28,6 +28,7 @@ Apache Tika 0.8
 Carrot2 3.5.0
 Velocity 1.6.4 and Velocity Tools 2.0
 Apache UIMA 2.3.1-SNAPSHOT
+Apache ZooKeeper 3.3.3
 
 
 Upgrading from Solr 3.3-dev
@@ -52,7 +53,9 @@ Upgrading from Solr 3.3-dev
   If q.op is effectively "OR" then mm=0%.  Users who wish to force the
   legacy behavior should set a default value for the 'mm' param in
   their solrconfig.xml file.
-
+  
+* FacetComponent no longer catches and embeds exceptions that occur during facet
+  processing; it now throws HTTP 400 or 500 exceptions instead.
 
 Detailed Change List
 ----------------------
@@ -96,7 +99,9 @@ New Features
 
 * SOLR-1566: Transforming documents in the ResponseWriters.  This will allow
   for more complex results in responses and open the door for function queries
-  as results. (ryan with patches from grant, noble, cmale, yonik, Jan Høydahl)
+  as results. 
+  (ryan with patches from grant, noble, cmale, yonik, Jan Høydahl, 
+  Arul Kalaipandian, hossman)
 
 * SOLR-2396: Add CollationField, which is much more efficient than 
   the Solr 3.x CollationKeyFilterFactory, and also supports 
@@ -154,6 +159,13 @@ New Features
   for faster reopen times. There is also a new 'soft' autocommit tracker that can be
   configured. (Mark Miller, Robert Muir)
 
+* SOLR-2399: Updated Solr Admin interface.  New look and feel with per core administration
+  and many new options.  (Stefan Matheis via ryan)
+
+* SOLR-1032: CSV handler now supports "literal.field_name=value" parameters.
+  (Simon Rosenthal, ehatcher)
+
+
 Optimizations
 ----------------------
 
@@ -180,6 +192,10 @@ Optimizations
   on each commit (ie commits no longer wait for background merges to complete), 
   works with SolrCore to provide faster 'soft' commits, and has an improved API 
   that requires less instanceof special casing. (Mark Miller, Robert Muir)
+  Additional Work:
+  SOLR-2697: commit and autocommit operations don't reset the
+  DirectUpdateHandler2.numDocsPending stats attribute.
+  (Alexey Serba, Mark Miller)
 
 Bug Fixes
 ----------------------
@@ -208,9 +224,15 @@ Bug Fixes
   
 * SOLR-2193, SOLR-2565, SOLR-2651: SolrCores now properly share IndexWriters across SolrCore reloads.
   (Mark Miller, Robert Muir)
-  
-* SOLR-2535: REGRESSION: in Solr 3.x and trunk the admin/file handler 
-  fails to show directory listings (David Smiley, Peter Wolanin via Erick Erickson)
+  Additional Work:
+  SOLR-2705: On reload, IndexWriterProvider holds onto the initial SolrCore it was created with.
+  (Yury Kats, Mark Miller)
+
+* SOLR-2682: Remove addException() in SimpleFacets. FacetComponent no longer catches and embeds
+  exceptions that occur during facet processing; it now throws HTTP 400 or 500 exceptions instead. (koji)
+
+* SOLR-2654: Directorys used by a SolrCore are now closed when they are no longer used.
+  (Mark Miller)
   
 Other Changes
 ----------------------
@@ -267,29 +289,19 @@ Other Changes
 * SOLR-1825: SolrQuery.addFacetQuery now enables facets automatically, like
   addFacetField (Chris Male)
 
-* SOLR-2452: Rewrote the Solr build system:
-  - Integrated more fully with the Lucene build system: generalized the
-    Lucene build system and eliminated duplication.
-  - Converted all Solr contribs to the Lucene/Solr conventional src/ layout:
-    java/, resources/, test/, and test-files/.
-  - Created a new Solr-internal module named "core" by moving the java/,
-    test/, and test-files/ directories from solr/src/ to solr/core/src/.
-  - Merged solr/src/webapp/src/ into solr/core/src/java/.
-  - Eliminated solr/src/ by moving all its directories up one level;
-    renamed solr/src/site/ to solr/site-src/ because solr/site/ already
-    exists.
-  - Merged solr/src/common/ into solr/solrj/src/java/.
-  - Moved o.a.s.client.solrj.* and o.a.s.common.* tests from
-    solr/src/test/ to solr/solrj/src/test/.
-  - Made the solrj tests not depend on the solr core tests by moving
-    some classes from solr/src/test/ to solr/test-framework/src/java/.
-  - Each internal module (core/, solrj/, test-framework/, and webapp/)
-    now has its own build.xml, from which it is possible to run
-    module-specific targets.  solr/build.xml delegates all build
-    tasks (via <ant dir="internal-module-dir"> calls) to these
-    modules' build.xml files.
-  (Steve Rowe, Robert Muir)
+* SOLR-2663: FieldTypePluginLoader has been refactored out of IndexSchema 
+  and made public. (hossman)
 
+* SOLR-2331,SOLR-2691: Refactor CoreContainer's SolrXML serialization code and improve testing
+  (Yury Kats, hossman, Mark Miller)
+  
+* SOLR-2698: Enhance CoreAdmin STATUS command to return index size.
+  (Yury Kats, hossman, Mark Miller)
+  
+* SOLR-2654: The same Directory instance is now always used across a SolrCore so that
+  it's easier to add other DirectoryFactory implementations without static caching hacks.
+  (Mark Miller)
+  
 Documentation
 ----------------------
 
@@ -307,6 +319,19 @@ Upgrading from Solr 3.3
   before the master.  If the master were to be updated first, the older
   searchers would not be able to read the new index format.
 
+* Previous versions of Solr silently allowed and ignored some contradictory 
+  properties specified in schema.xml.  For example:
+    - indexed="false" omitNorms="false" 
+    - indexed="false" omitTermFreqAndPositions="false"
+  Field property validation has now been fixed so that contradictions
+  like these cause an error on startup instead of being ignored.  If
+  users have existing schemas that generate one of these new "conflicting
+  'false' field options for non-indexed field" error messages, the
+  conflicting "omit*" properties can safely be removed, or changed to
+  "true", for behavior consistent with previous Solr versions.
+  See SOLR-2669.
+
 New Features
 ----------------------
 
@@ -336,6 +361,27 @@ New Features
 * LUCENE-2048: Added omitPositions to the schema, so you can omit position
   information while still indexing term frequencies.  (rmuir)
 
+* SOLR-2584: add UniqFieldsUpdateProcessor that removes duplicate values in the
+  specified fields. (Elmer Garduno, koji)
+
+* SOLR-2670: Added NIOFSDirectoryFactory (yonik)
+
+* SOLR-2523: Added support in SolrJ to easily interact with range facets.
+  The range facet response can be parsed and is retrievable from the
+  QueryResponse class. The SolrQuery class has convenient methods for using
+  range facets. (Martijn van Groningen)
+
+* SOLR-2637: Added support for group result parsing in SolrJ.
+  (Tao Cheng, Martijn van Groningen)
+
+* SOLR-2665: Added post group faceting. Facet counts are based on the most
+  relevant document of each group matching the query. This feature has the
+  same impact on the StatsComponent. (Martijn van Groningen)
+
+* SOLR-2675: CoreAdminHandler now allows arbitrary properties to be
+  specified when CREATEing a new SolrCore using property.* request
+  params.  (Yury Kats, hossman)
+
 Optimizations
 ----------------------
 
@@ -375,12 +421,62 @@ Bug Fixes
 
 * SOLR-2642: Fixed sorting by function when using grouping. (Thomas Heigl, Martijn van Groningen)
 
+* SOLR-2535: REGRESSION: in Solr 3.x and trunk the admin/file handler
+  fails to show directory listings (David Smiley, Peter Wolanin via Erick Erickson)
+
+* SOLR-2545: ExternalFileField file parsing would fail if any key
+  contained an "=" character.  It now only looks for the last "=" delimiter 
+  prior to the float value.
+  (Markus Jelsma, hossman)
+
+* SOLR-2662: When Solr is configured to have no queryResultCache, the
+  "start" parameter was not honored and the documents returned were
+   0 through start+offset.  (Markus Jelsma, yonik)
+
+* SOLR-2669: Fix backwards validation of field properties in 
+  SchemaField.calcProps (hossman)
+
+* SOLR-2676: Add "welcome-file-list" to solr.war so admin UI works correctly 
+  in servlet containers such as WebSphere that do not use a default list
+  (Jay R. Jaeger, hossman)
+
+* SOLR-2606: Fixed sort parsing of fields containing punctuation that 
+  failed due to sort by function changes introduced in SOLR-1297
+  (Mitsu Hadeishi, hossman)
+
  Other Changes
 ----------------------
 
+* SOLR-2629: Eliminate deprecation warnings in some JSPs.
+  (Bernd Fehling, hossman)
+
 Build
 ----------------------
 
+* SOLR-2452,SOLR-2653,LUCENE-3323,SOLR-2659,LUCENE-3329,SOLR-2666:
+  Rewrote the Solr build system:
+  - Integrated more fully with the Lucene build system: generalized the
+    Lucene build system and eliminated duplication.
+  - Converted all Solr contribs to the Lucene/Solr conventional src/ layout:
+    java/, resources/, test/, and test-files/<contrib-name>.
+  - Created a new Solr-internal module named "core" by moving the java/,
+    test/, and test-files/ directories from solr/src/ to solr/core/src/.
+  - Merged solr/src/webapp/src/ into solr/core/src/java/.
+  - Eliminated solr/src/ by moving all its directories up one level;
+    renamed solr/src/site/ to solr/site-src/ because solr/site/ already
+    exists.
+  - Merged solr/src/common/ into solr/solrj/src/java/.
+  - Moved o.a.s.client.solrj.* and o.a.s.common.* tests from
+    solr/src/test/ to solr/solrj/src/test/.
+  - Made the solrj tests not depend on the solr core tests by moving
+    some classes from solr/src/test/ to solr/test-framework/src/java/.
+  - Each internal module (core/, solrj/, test-framework/, and webapp/)
+    now has its own build.xml, from which it is possible to run
+    module-specific targets.  solr/build.xml delegates all build
+    tasks (via <ant dir="internal-module-dir"> calls) to these
+    modules' build.xml files.
+  (Steve Rowe, Robert Muir)
+
 Documentation
 ----------------------
 

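To make the SOLR-2669 upgrade note above concrete, here is a minimal illustrative
schema.xml fragment. It is not part of this commit; the field name and type are
hypothetical. The first declaration is the kind of contradiction that now fails at
startup, the second is the corrected form.

    <!-- Hypothetical field: rejected at startup since SOLR-2669, because
         omitNorms="false" contradicts indexed="false". -->
    <field name="popularity" type="float" indexed="false" stored="true" omitNorms="false"/>

    <!-- Corrected form: drop the conflicting omit* property (or set it to "true"). -->
    <field name="popularity" type="float" indexed="false" stored="true"/>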
Modified: lucene/dev/branches/flexscoring/solr/build.xml
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/build.xml?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/build.xml (original)
+++ lucene/dev/branches/flexscoring/solr/build.xml Tue Aug 23 14:06:58 2011
@@ -40,7 +40,7 @@
   <!-- ========================================================================= -->
  
   <target name="example" description="Creates a runnable example configuration."
-          depends="prep-lucene-jars,dist-contrib,dist-war,build-contrib">
+          depends="dist-contrib,dist-war">
     <copy file="${dist}/${fullnamever}.war"
           tofile="${example}/webapps/${ant.project.name}.war"/>
     <jar destfile="${example}/exampledocs/post.jar"
@@ -88,6 +88,13 @@
     <setproxy proxyhost="${proxy.host}" proxyport="${proxy.port}" proxyuser="${proxy.user}" proxypassword="${proxy.password}"/>
   </target>
  
+
+  <target name="compile-xml-query-parser">
+  	<ant dir="${common.dir}/contrib/xml-query-parser" target="compile-core" inheritAll="false">
+      <propertyset refid="uptodate.and.compiled.properties"/>
+    </ant>
+  </target>
+
   <property  name="luke.version" value="1.0.1"/>
   <available file="luke/luke-${luke.version}.jar" property="luke.jar.exists" />
   <target name="luke-download" unless="luke.jar.exists" depends="proxy.setup,compile-xml-query-parser">
@@ -120,13 +127,13 @@
   <!-- ========================================================================= -->
   
   <target name="compile" description="Compile the source code."
-          depends="compile-core, build-contrib"/>
+          depends="compile-core, compile-contrib"/>
   <target name="test" description="Validate, then run core, solrj, and contrib unit tests."
-          depends="validate-solr, test-core, test-jsp, test-contrib"/>
+          depends="validate-solr, test-core, test-contrib"/>
   <target name="test-core" description="Runs the core and solrj unit tests."
           depends="test-solr-core, test-solrj"/>
   <target name="compile-test" description="Compile unit tests."
-          depends="compile-solr-test-framework, compile-test-solr-core, compile-test-solrj"/>
+          depends="compile-solr-test-framework, compile-test-solr-core, compile-test-solrj, compile-test-contrib, test-jsp"/>
   <target name="javadocs" description="Calls javadocs-all" depends="javadocs-all"/>
   <target name="compile-core" depends="compile-solr-core" unless="solr.core.compiled"/>
   
@@ -139,18 +146,21 @@
   
   <!-- Solrj targets -->
   <target name="test-solrj" description="Test java client">
-    <ant dir="solrj" target="test" inheritAll="false"/>
+    <ant dir="solrj" target="test" inheritAll="false">
+      <propertyset refid="uptodate.and.compiled.properties"/>
+    </ant>
   </target>
   
   <!-- Solr contrib targets -->
-  <target name="test-contrib" description="Run contrib unit tests."
-          depends="build-contrib">
+  <target name="test-contrib" description="Run contrib unit tests.">
     <contrib-crawl target="test" failonerror="true"/>
   </target>
   
   <!-- test-framework targets -->
   <target name="javadocs-test-framework">  <!-- Called from Jenkins build script --> 
-    <ant dir="test-framework" target="javadocs" inheritAll="false"/>
+    <ant dir="test-framework" target="javadocs" inheritAll="false">
+      <propertyset refid="uptodate.and.compiled.properties"/>
+  	</ant>
   </target>
   
   <!-- Webapp targets -->
@@ -379,11 +389,16 @@
     </copy>
   </target>
  
-  <target name="javadocs-all" depends="compile,javadocs-dep"
+  <target name="javadocs-all" depends="prep-lucene-jars,javadocs-dep"
           description="Generate javadoc for core, java client and contrib">
     <sequential>
       <mkdir dir="${dest}/docs/api"/>
  
+      <!-- TODO: optimize this, thats stupid here: -->
+      <subant target="module-jars-to-solr">
+        <fileset dir="contrib/analysis-extras" includes="build.xml"/>
+      </subant>
+
       <path id="javadoc.classpath">
         <path refid="classpath"/>
         <fileset dir="${dest}/contrib">

Modified: lucene/dev/branches/flexscoring/solr/client/ruby/solr-ruby/lib/solr/indexer.rb
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/client/ruby/solr-ruby/lib/solr/indexer.rb?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/client/ruby/solr-ruby/lib/solr/indexer.rb (original)
+++ lucene/dev/branches/flexscoring/solr/client/ruby/solr-ruby/lib/solr/indexer.rb Tue Aug 23 14:06:58 2011
@@ -42,7 +42,7 @@ class Solr::Indexer
     end
     add_docs(buffer) if !buffer.empty?
     
-    @solr.commit unless @debug
+    @solr.commit unless @debug  # TODO: provide option to not commit
   end
   
   def add_docs(documents)

Modified: lucene/dev/branches/flexscoring/solr/common-build.xml
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/common-build.xml?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/common-build.xml (original)
+++ lucene/dev/branches/flexscoring/solr/common-build.xml Tue Aug 23 14:06:58 2011
@@ -78,94 +78,12 @@
     -->
   <property name="solr.spec.version" value="4.0.0.${dateversion}" />
 
-  <!-- solr depends on the following modules/contribs -->	
-  <module-uptodate name="analysis/common" jarfile="${common.dir}/../modules/analysis/build/common/lucene-analyzers-common-${version}.jar"
-        property="analyzers-common.uptodate" classpath.property="analyzers-common.jar"/>
-  <module-uptodate name="analysis/phonetic" jarfile="${common.dir}/../modules/analysis/build/phonetic/lucene-analyzers-phonetic-${version}.jar"
-        property="analyzers-phonetic.uptodate" classpath.property="analyzers-phonetic.jar"/>
-  <module-uptodate name="suggest" jarfile="${common.dir}/../modules/suggest/build/lucene-suggest-${version}.jar"
-        property="suggest.uptodate" classpath.property="suggest.jar"/>
-  <module-uptodate name="grouping" jarfile="${common.dir}/../modules/grouping/build/lucene-grouping-${version}.jar"
-        property="grouping.uptodate" classpath.property="grouping.jar"/>
-  <module-uptodate name="queries" jarfile="${common.dir}/../modules/queries/build/lucene-queries-${version}.jar"
-        property="queries.uptodate" classpath.property="queries.jar"/>
-  <module-uptodate name="queryparser" jarfile="${common.dir}/../modules/queryparser/build/lucene-queryparser-${version}.jar"
-        property="queryparser.uptodate" classpath.property="queryparser.jar"/>
-  <contrib-uptodate name="highlighter" property="highlighter.uptodate" classpath.property="highlighter.jar"/>
-  <contrib-uptodate name="memory" property="memory.uptodate" classpath.property="memory.jar"/>
-  <contrib-uptodate name="misc" property="misc.uptodate" classpath.property="misc.jar"/>
-  <contrib-uptodate name="queries-contrib" contrib-src-name="queries" property="queries-contrib.uptodate" classpath.property="queries-contrib.jar"/>
-  <contrib-uptodate name="spatial" property="spatial.uptodate" classpath.property="spatial.jar"/>
-
-  <target name="compile-analyzers-common" unless="analyzers-common.uptodate">
-  	<ant dir="${common.dir}/../modules/analysis/common" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-analyzers-phonetic" unless="analyzers-phonetic.uptodate">
-  	<ant dir="${common.dir}/../modules/analysis/phonetic" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-suggest" unless="suggest.uptodate">
-  	<ant dir="${common.dir}/../modules/suggest" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-grouping" unless="grouping.uptodate">
-  	<ant dir="${common.dir}/../modules/grouping" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-queries" unless="queries.uptodate">
-  	<ant dir="${common.dir}/../modules/queries" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-queryparser" unless="queryparser.uptodate">
-  	<ant dir="${common.dir}/../modules/queryparser" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-highlighter" unless="highlighter.uptodate">
-  	<ant dir="${common.dir}/contrib/highlighter" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-memory" unless="memory.uptodate">
-  	<ant dir="${common.dir}/contrib/memory" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-misc" unless="misc.uptodate">
-  	<ant dir="${common.dir}/contrib/misc" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-queries-contrib" unless="queries-contrib.uptodate">
-  	<ant dir="${common.dir}/contrib/queries" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <target name="compile-spatial" unless="spatial.uptodate">
-  	<ant dir="${common.dir}/contrib/spatial" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-  <!-- xml-query-parser contrib is required by the "luke" target -->
-  <target name="compile-xml-query-parser" unless="xml-query-parser.uptodate">
-  	<ant dir="${common.dir}/contrib/xml-query-parser" target="compile-core" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-
   <path id="solr.base.classpath">
   	<pathelement path="${analyzers-common.jar}"/>
   	<pathelement path="${analyzers-phonetic.jar}"/>
   	<pathelement path="${highlighter.jar}"/>
   	<pathelement path="${memory.jar}"/>
   	<pathelement path="${misc.jar}"/>
-  	<pathelement path="${queries-contrib.jar}"/>
   	<pathelement path="${spatial.jar}"/>
   	<pathelement path="${suggest.jar}"/>
     <pathelement path="${grouping.jar}"/>
@@ -203,7 +121,7 @@
   </macrodef>
 
   <target name="validate" depends="validate-solr"/>
-  <target name="validate-solr" depends="check-legal-solr" unless="validated-solr"/>
+  <target name="validate-solr" depends="check-legal-solr" unless="validated-solr.uptodate"/>
 
   <target name="check-legal-solr" depends="compile-tools">
     <java classname="org.apache.lucene.validation.DependencyChecker" failonerror="true" fork="true">
@@ -237,7 +155,7 @@
       <arg value="-c" />
       <arg value="${common-solr.dir}/core/src/test-files/solr/lib" />
     </java>
-    <property name="validated-solr" value="true"/>
+    <property name="validated-solr.uptodate" value="true"/>
   </target>
   <path id="tools.runtime.classpath">
     <pathelement location="${common.dir}/build/classes/tools"/>
@@ -250,18 +168,20 @@
     <mkdir dir="${maven.dist.dir}"/>
   </target>
 
-  <target name="prep-lucene-jars"
-          depends="compile-analyzers-common, compile-analyzers-phonetic, compile-suggest,
-                   compile-highlighter, compile-memory, compile-misc, compile-queries-contrib,
-                   compile-spatial, compile-grouping, compile-queries, compile-queryparser">
+  <target name="prep-lucene-jars" 
+  	      depends="jar-lucene-core, jar-analyzers-phonetic, jar-suggest, jar-highlighter, jar-memory,
+  	               jar-misc, jar-spatial, jar-grouping, jar-queries, jar-queryparser">
+  	  <property name="solr.deps.compiled" value="true"/>
+  </target>
+	
+  <target name="lucene-jars-to-solr" depends="prep-lucene-jars">
+    <!-- TODO: clean this up -->
+    <sequential>
     <ant dir="${common.dir}" target="default" inheritall="false">
       <propertyset refid="uptodate.and.compiled.properties"/>
     </ant>
-  </target>
-
-  <target name="lucene-jars-to-solr" depends="prep-lucene-jars">
     <copy todir="${lucene-libs}" preservelastmodified="true" flatten="true" failonerror="true" overwrite="true">
-      <fileset file="${common.dir}/build/lucene-core-${version}.jar" />
+      <fileset file="${lucene-core.jar}" />
       <fileset file="${analyzers-common.jar}" />
       <fileset file="${analyzers-phonetic.jar}" />
       <fileset file="${suggest.jar}" />
@@ -271,9 +191,9 @@
       <fileset file="${highlighter.jar}" />
       <fileset file="${memory.jar}" />
       <fileset file="${misc.jar}" />
-      <fileset file="${queries-contrib.jar}" />
       <fileset file="${spatial.jar}" />
     </copy>
+    </sequential>
   </target>
 
   <!-- Shared core/solrj/test-framework/contrib targets -->
@@ -285,6 +205,9 @@
             spec.version="${solr.spec.version}"/>
   </target>
 
+  <target name="compile-core" depends="prep-lucene-jars,common.compile-core"/>
+  <target name="compile-test" depends="compile-solr-test-framework,common.compile-test"/>
+
   <target name="dist" depends="jar-core">
     <copy file="${build.dir}/${fullnamever}.jar" todir="${dist}"/>
   </target>
@@ -292,6 +215,7 @@
   <target name="javadocs" depends="compile-core">
    	<sequential>
       <mkdir dir="${javadoc.dir}"/>
+      <mkdir dir="${dest}/META-INF/"/>
       <invoke-javadoc destdir="${javadoc.dir}"
                       title="${Name} ${version} ${name} API">
         <sources>
@@ -361,14 +285,18 @@
     <ant dir="${common-solr.dir}/test-framework" target="compile-core" inheritAll="false">
       <propertyset refid="uptodate.and.compiled.properties"/>
     </ant>
+  	<property name="solr.core.compiled" value="true"/>
     <property name="solr.test.framework.compiled" value="true"/>
   </target>
 
   <!-- Solr contrib targets -->
-  <target name="build-contrib" depends="compile-test"
-          description="Builds all contrib modules and their tests">
-    <contrib-crawl target="build-artifacts-and-tests"/>
+  <target name="compile-contrib" description="Compile contrib modules">
+  	<contrib-crawl target="compile-core"/>
+  </target>
+  <target name="compile-test-contrib" description="Compile contrib modules' tests">
+  	<contrib-crawl target="compile-test"/>
   </target>
+
   <target name="contribs-add-to-war">
     <mkdir dir="${dest}/web"/>
     <delete dir="${dest}/web" includes="**/*" failonerror="false"/>

Modified: lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/build.xml
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/build.xml?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/build.xml (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/build.xml Tue Aug 23 14:06:58 2011
@@ -25,13 +25,6 @@
 
   <import file="../contrib-build.xml"/>
 
-  <module-uptodate name="analysis/icu" jarfile="${common.dir}/../modules/analysis/build/icu/lucene-analyzers-icu-${version}.jar"
-	    property="analyzers-icu.uptodate" classpath.property="analyzers-icu.jar"/>
-  <module-uptodate name="analysis/smartcn" jarfile="${common.dir}/../modules/analysis/build/smartcn/lucene-analyzers-smartcn-${version}.jar"
-		property="analyzers-smartcn.uptodate" classpath.property="analyzers-smartcn.jar"/>
-  <module-uptodate name="analysis/stempel" jarfile="${common.dir}/../modules/analysis/build/stempel/lucene-analyzers-stempel-${version}.jar"
-		property="analyzers-stempel.uptodate" classpath.property="analyzers-stempel.jar"/>
-
   <path id="classpath">
   	<pathelement path="${analyzers-icu.jar}"/>
   	<pathelement path="${analyzers-smartcn.jar}"/>
@@ -39,26 +32,8 @@
     <path refid="solr.base.classpath"/>
   </path>
 
-  <target name="compile-analyzers-icu" unless="analyzers-icu.uptodate">
-  	<ant dir="${common.dir}/../modules/analysis/icu" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-
-  <target name="compile-analyzers-smartcn" unless="analyzers-smartcn.uptodate">
-  	<ant dir="${common.dir}/../modules/analysis/smartcn" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-
-  <target name="compile-analyzers-stempel" unless="analyzers-stempel.uptodate">
-  	<ant dir="${common.dir}/../modules/analysis/stempel" target="default" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-  </target>
-
   <target name="module-jars-to-solr"
-          depends="compile-analyzers-icu, compile-analyzers-smartcn, compile-analyzers-stempel">
+          depends="jar-analyzers-icu, jar-analyzers-smartcn, jar-analyzers-stempel">
     <mkdir dir="${build.dir}/lucene-libs"/>
     <copy todir="${build.dir}/lucene-libs" preservelastmodified="true" flatten="true" failonerror="true" overwrite="true">
       <fileset file="${analyzers-icu.jar}"/>
@@ -67,5 +42,6 @@
     </copy>
   </target>
 
-  <target name="compile-core" depends="module-jars-to-solr, solr-contrib-build.compile-core"/>
+  <target name="compile-core" depends="jar-analyzers-icu, jar-analyzers-smartcn, jar-analyzers-stempel, solr-contrib-build.compile-core"/>
+  <target name="dist" depends="module-jars-to-solr, common-solr.dist"/>
 </project>

Modified: lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/src/test/org/apache/solr/schema/TestICUCollationField.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/src/test/org/apache/solr/schema/TestICUCollationField.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/src/test/org/apache/solr/schema/TestICUCollationField.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/analysis-extras/src/test/org/apache/solr/schema/TestICUCollationField.java Tue Aug 23 14:06:58 2011
@@ -74,8 +74,8 @@ public class TestICUCollationField exten
     confDir.mkdir();
     
     // copy over configuration files
-    FileUtils.copyFile(getFile("solr-analysis-extras/conf/solrconfig-icucollate.xml"), new File(confDir, "solrconfig.xml"));
-    FileUtils.copyFile(getFile("solr-analysis-extras/conf/schema-icucollate.xml"), new File(confDir, "schema.xml"));
+    FileUtils.copyFile(getFile("analysis-extras/solr/conf/solrconfig-icucollate.xml"), new File(confDir, "solrconfig.xml"));
+    FileUtils.copyFile(getFile("analysis-extras/solr/conf/schema-icucollate.xml"), new File(confDir, "schema.xml"));
     
     // generate custom collation rules (DIN 5007-2), saving to customrules.dat
     RuleBasedCollator baseCollator = (RuleBasedCollator) Collator.getInstance(new ULocale("de", "DE"));

Modified: lucene/dev/branches/flexscoring/solr/contrib/clustering/CHANGES.txt
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/clustering/CHANGES.txt?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/clustering/CHANGES.txt (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/clustering/CHANGES.txt Tue Aug 23 14:06:58 2011
@@ -13,7 +13,11 @@ $Id$
 
 ================== Release 3.4.0-dev ==============
 
-(No Changes)
+SOLR-2706: The carrot.lexicalResourcesDir parameter now works 
+   with absolute directories (Stanislaw Osinski)
+  
+SOLR-2692: Typo in param name fixed: "carrot.fragzise" changed to 
+  "carrot.fragSize" (Stanislaw Osinski).
 
 ================== Release 3.3.0 ==================
 

Modified: lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngine.java Tue Aug 23 14:06:58 2011
@@ -173,8 +173,8 @@ public class CarrotClusteringEngine exte
           @Override
           public IResource[] getAll(final String resource) {
             final SolrResourceLoader resourceLoader = core.getResourceLoader();
-            final String carrot2ResourcesDir = resourceLoader.getConfigDir()
-                + initParams.get(CarrotParams.LEXICAL_RESOURCES_DIR, CARROT_RESOURCES_PREFIX);
+            final String carrot2ResourcesDir = initParams.get(
+                CarrotParams.LEXICAL_RESOURCES_DIR, CARROT_RESOURCES_PREFIX);
             try {
               log.debug("Looking for " + resource + " in "
                   + carrot2ResourcesDir);
@@ -264,7 +264,7 @@ public class CarrotClusteringEngine exte
 
     SolrQueryRequest req = null;
     String[] snippetFieldAry = null;
-    if (produceSummary == true) {
+    if (produceSummary) {
       highlighter = HighlightComponent.getHighlighter(core);
       if (highlighter != null){
         Map<String, Object> args = Maps.newHashMap();

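The CarrotClusteringEngine change above reads carrot.lexicalResourcesDir straight from
the engine's init params instead of prefixing it with the config directory, which is what
lets absolute directories work (SOLR-2706 in the clustering CHANGES.txt above). A minimal
illustrative solrconfig.xml sketch follows; the engine name, class shorthand, and path are
assumptions for the example, not taken from this commit.

    <searchComponent name="clustering" class="solr.clustering.ClusteringComponent">
      <lst name="engine">
        <str name="name">default</str>
        <!-- Absolute directories now work here (SOLR-2706); the path below is
             a hypothetical example. -->
        <str name="carrot.lexicalResourcesDir">/var/carrot2/resources</str>
      </lst>
    </searchComponent>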
Modified: lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotParams.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotParams.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotParams.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/clustering/src/java/org/apache/solr/handler/clustering/carrot2/CarrotParams.java Tue Aug 23 14:06:58 2011
@@ -33,7 +33,7 @@ public interface CarrotParams {
   String PRODUCE_SUMMARY = CARROT_PREFIX + "produceSummary";
   String NUM_DESCRIPTIONS = CARROT_PREFIX + "numDescriptions";
   String OUTPUT_SUB_CLUSTERS = CARROT_PREFIX + "outputSubClusters";
-  String SUMMARY_FRAGSIZE = CARROT_PREFIX + "fragzise";
+  String SUMMARY_FRAGSIZE = CARROT_PREFIX + "fragSize";
 
   String LEXICAL_RESOURCES_DIR = CARROT_PREFIX + "lexicalResourcesDir";
 

Modified: lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/AbstractClusteringTestCase.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/AbstractClusteringTestCase.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/AbstractClusteringTestCase.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/AbstractClusteringTestCase.java Tue Aug 23 14:06:58 2011
@@ -28,7 +28,7 @@ public abstract class AbstractClustering
 
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig.xml", "schema.xml", "solr-clustering");
+    initCore("solrconfig.xml", "schema.xml", "clustering/solr");
     numberOfDocs = 0;
     for (String[] doc : DOCUMENTS) {
       assertNull(h.validateUpdate(adoc("id", Integer.toString(numberOfDocs), "url", doc[0], "title", doc[1], "snippet", doc[2])));

Modified: lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/DistributedClusteringComponentTest.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/DistributedClusteringComponentTest.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/DistributedClusteringComponentTest.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/DistributedClusteringComponentTest.java Tue Aug 23 14:06:58 2011
@@ -25,8 +25,7 @@ public class DistributedClusteringCompon
 
   @Override
   public String getSolrHome() {
-    // TODO: this should work with just "solr-clustering"...
-    return getFile("solr-clustering").getAbsolutePath();
+    return "clustering/solr";
   }
 
   @Override

Modified: lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngineTest.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngineTest.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngineTest.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/clustering/src/test/org/apache/solr/handler/clustering/carrot2/CarrotClusteringEngineTest.java Tue Aug 23 14:06:58 2011
@@ -58,14 +58,52 @@ public class CarrotClusteringEngineTest 
 
   @Test
   public void testProduceSummary() throws Exception {
-    ModifiableSolrParams solrParams = new ModifiableSolrParams();
-    solrParams.add(CarrotParams.SNIPPET_FIELD_NAME, "snippet");
-    solrParams.add(CarrotParams.SUMMARY_FRAGSIZE, "200");//how do we validate this?
+    // We'll make two queries, one with a summary and another without,
+    // and assert that documents are shorter when the highlighter is in use.
+    final List<NamedList<Object>> noSummaryClusters = clusterWithHighlighting(false, 80);
+    final List<NamedList<Object>> summaryClusters = clusterWithHighlighting(true, 80);
+
+    assertEquals("Equal number of clusters", noSummaryClusters.size(), summaryClusters.size());
+    for (int i = 0; i < noSummaryClusters.size(); i++) {
+      assertTrue("Summary shorter than original document", 
+          getLabels(noSummaryClusters.get(i)).get(1).length() > 
+          getLabels(summaryClusters.get(i)).get(1).length()); 
+    }
+  }
+  
+  @Test
+  public void testSummaryFragSize() throws Exception {
+    // We'll make two queries, one with short summaries and another with
+    // longer summaries, and check that the results differ.
+    final List<NamedList<Object>> shortSummaryClusters = clusterWithHighlighting(true, 30);
+    final List<NamedList<Object>> longSummaryClusters = clusterWithHighlighting(true, 80);
+    
+    assertEquals("Equal number of clusters", shortSummaryClusters.size(), longSummaryClusters.size());
+    for (int i = 0; i < shortSummaryClusters.size(); i++) {
+      assertTrue("Shorter fragSize yields shorter summary", 
+          getLabels(shortSummaryClusters.get(i)).get(1).length() < 
+          getLabels(longSummaryClusters.get(i)).get(1).length()); 
+    }
+  }
+
+  private List<NamedList<Object>> clusterWithHighlighting(
+      boolean enableHighlighting, int fragSize) throws IOException {
+    
+    final TermQuery query = new TermQuery(new Term("snippet", "mine"));
+    // Two documents don't have mining in the snippet
+    int expectedNumDocuments = numberOfDocs - 2;
+
+    final ModifiableSolrParams summaryParams = new ModifiableSolrParams();
+    summaryParams.add(CarrotParams.SNIPPET_FIELD_NAME, "snippet");
+    summaryParams.add(CarrotParams.PRODUCE_SUMMARY,
+        Boolean.toString(enableHighlighting));
+    summaryParams
+        .add(CarrotParams.SUMMARY_FRAGSIZE, Integer.toString(fragSize));
+    final List<NamedList<Object>> summaryClusters = checkEngine(
+        getClusteringEngine("echo"), expectedNumDocuments,
+        expectedNumDocuments, query, summaryParams);
     
-  	// Note: the expected number of clusters may change after upgrading Carrot2
-  	// due to e.g. internal improvements or tuning of Carrot2 clustering.
-    final int expectedNumClusters = 15;
-    checkEngine(getClusteringEngine("default"), numberOfDocs -2 /*two don't have mining in the snippet*/, expectedNumClusters, new TermQuery(new Term("snippet", "mine")), solrParams);
+    return summaryClusters;
   }
 
   @Test
@@ -152,7 +190,7 @@ public class CarrotClusteringEngineTest 
 				wordsToCheck);
 
 		// "customsolrstopword" is in stopwords.en, "customsolrstoplabel" is in
-		// stoplabels.en, so we're expecting only one cluster with label "online".
+		// stoplabels.mt, so we're expecting only one cluster with label "online".
 		final List<NamedList<Object>> clusters = checkEngine(
 				getClusteringEngine(engineName), 1, params);
 		assertEquals(getLabels(clusters.get(0)), ImmutableList.of("online"));
@@ -227,7 +265,6 @@ public class CarrotClusteringEngineTest 
       assertEquals("docList size", expectedNumDocs, docList.matches());
 
       ModifiableSolrParams solrParams = new ModifiableSolrParams();
-      solrParams.add(CarrotParams.PRODUCE_SUMMARY, "true");
       solrParams.add(clusteringParams);
 
       // Perform clustering

Modified: lucene/dev/branches/flexscoring/solr/contrib/contrib-build.xml
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/contrib-build.xml?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/contrib-build.xml (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/contrib-build.xml Tue Aug 23 14:06:58 2011
@@ -23,14 +23,7 @@
 
   <import file="../common-build.xml"/>
 
-  <target name="build-solr" unless="solr.core.compiled">
-    <ant dir="${common-solr.dir}/core" target="compile-test" inheritAll="false">
-      <propertyset refid="uptodate.and.compiled.properties"/>
-    </ant>
-    <property name="solr.core.compiled" value="true"/>
-  </target>
-
-  <target name="compile-core" depends="build-solr, common.compile-core"/>
+  <target name="compile-core" depends="compile-solr-core,compile-solrj,common-solr.compile-core"/>
 
   <!-- redefine common-solr.test, and exclude 'validate-solr' dependency, since it should only run at solr/ level -->
   <target name="test" depends="compile-test,junit-mkdir,junit-sequential,junit-parallel" description="Runs unit tests"/>

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/build.xml
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/build.xml?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/build.xml (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/build.xml Tue Aug 23 14:06:58 2011
@@ -30,7 +30,7 @@
                          classpath.property="solr-dataimporthandler.jar"/>
 
   <target name="compile-solr-dataimporthandler" unless="solr-dataimporthandler.uptodate">
-  	<ant dir="${common-solr.dir}/contrib/dataimporthandler" target="default" inheritAll="false">
+  	<ant dir="${common-solr.dir}/contrib/dataimporthandler" target="compile-core" inheritAll="false">
       <propertyset refid="uptodate.and.compiled.properties"/>
     </ant>
   </target>
@@ -54,5 +54,5 @@
   </path>
 
   <target name="compile-core" depends="compile-solr-dataimporthandler, solr-contrib-build.compile-core"/>
-  <target name="compile-test" depends="compile-solr-dataimporthandler-tests, contrib-build.compile-test"/>
+  <target name="compile-test" depends="compile-solr-dataimporthandler-tests, common-solr.compile-test"/>
 </project>

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestMailEntityProcessor.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestMailEntityProcessor.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestMailEntityProcessor.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestMailEntityProcessor.java Tue Aug 23 14:06:58 2011
@@ -188,7 +188,7 @@ public class TestMailEntityProcessor ext
     Boolean commitCalled;
 
     public SolrWriterImpl() {
-      super(null, ".", null);
+      super(null, null);
     }
 
     @Override
@@ -196,10 +196,6 @@ public class TestMailEntityProcessor ext
       return docs.add(doc);
     }
 
-    @Override
-    public void log(int event, String name, Object row) {
-      // Do nothing
-    }
 
     @Override
     public void doDeleteAll() {

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestTikaEntityProcessor.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestTikaEntityProcessor.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestTikaEntityProcessor.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler-extras/src/test/org/apache/solr/handler/dataimport/TestTikaEntityProcessor.java Tue Aug 23 14:06:58 2011
@@ -40,7 +40,7 @@ public class TestTikaEntityProcessor ext
   "<dataConfig>" +
   "  <dataSource type=\"BinFileDataSource\"/>" +
   "  <document>" +
-  "    <entity processor=\"TikaEntityProcessor\" url=\"" + getFile("solr-word.pdf").getAbsolutePath() + "\" >" +
+  "    <entity processor=\"TikaEntityProcessor\" url=\"" + getFile("dihextras/solr-word.pdf").getAbsolutePath() + "\" >" +
   "      <field column=\"Author\" meta=\"true\" name=\"author\"/>" +
   "      <field column=\"title\" meta=\"true\" name=\"title\"/>" +
   "      <field column=\"text\"/>" +
@@ -58,7 +58,7 @@ public class TestTikaEntityProcessor ext
 
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("dataimport-solrconfig.xml", "dataimport-schema-no-unique-key.xml", getFile("solr-dihextras").getAbsolutePath());
+    initCore("dataimport-solrconfig.xml", "dataimport-schema-no-unique-key.xml", getFile("dihextras/solr").getAbsolutePath());
   }
 
   @Test

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/CHANGES.txt
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/CHANGES.txt?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/CHANGES.txt (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/CHANGES.txt Tue Aug 23 14:06:58 2011
@@ -19,6 +19,8 @@ Bug Fixes
 * SOLR-2644: When using threads=2 the default logging is set too high (Bill Bell via shalin)
 * SOLR-2492: DIH does not commit if only deletes are processed (James Dyer via shalin)
 * SOLR-2186: DataImportHandler's multi-threaded option throws NPE (Lance Norskog, Frank Wesemann, shalin)
+* SOLR-2655: DIH multi threaded mode does not resolve attributes correctly (Frank Wesemann, shalin)
+* SOLR-2695: Documents are collected in unsynchronized list in multi-threaded debug mode (Michael McCandless, shalin)
 
 ==================  3.3.0 ==================
 

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContextImpl.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContextImpl.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContextImpl.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/ContextImpl.java Tue Aug 23 14:06:58 2011
@@ -100,7 +100,7 @@ public class ContextImpl extends Context
     if (entity.dataSrc != null && docBuilder != null && docBuilder.verboseDebug &&
              Context.FULL_DUMP.equals(currentProcess())) {
       //debug is not yet implemented properly for deltas
-      entity.dataSrc = docBuilder.writer.getDebugLogger().wrapDs(entity.dataSrc);
+      entity.dataSrc = docBuilder.getDebugLogger().wrapDs(entity.dataSrc);
     }
     return entity.dataSrc;
   }

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImportHandler.java Tue Aug 23 14:06:58 2011
@@ -74,8 +74,6 @@ public class DataImportHandler extends R
 
   private Map<String, Properties> dataSources = new HashMap<String, Properties>();
 
-  private List<SolrInputDocument> debugDocuments;
-
   private boolean debugEnabled = true;
 
   private String myName = "dataimport";
@@ -113,7 +111,7 @@ public class DataImportHandler extends R
           final InputSource is = new InputSource(core.getResourceLoader().openConfig(configLoc));
           is.setSystemId(SystemIdResolver.createSystemIdFromResourceName(configLoc));
           importer = new DataImporter(is, core,
-                  dataSources, coreScopeSession);
+                  dataSources, coreScopeSession, myName);
         }
       }
     } catch (Throwable e) {
@@ -165,7 +163,7 @@ public class DataImportHandler extends R
         try {
           processConfiguration((NamedList) initArgs.get("defaults"));
           importer = new DataImporter(new InputSource(new StringReader(requestParams.dataConfig)), req.getCore()
-                  , dataSources, coreScopeSession);
+                  , dataSources, coreScopeSession, myName);
         } catch (RuntimeException e) {
           rsp.add("exception", DebugLogger.getStacktraceString(e));
           importer = null;
@@ -197,16 +195,18 @@ public class DataImportHandler extends R
         UpdateRequestProcessor processor = processorChain.createProcessor(req, rsp);
         SolrResourceLoader loader = req.getCore().getResourceLoader();
         SolrWriter sw = getSolrWriter(processor, loader, requestParams, req);
-
+        
         if (requestParams.debug) {
           if (debugEnabled) {
             // Synchronous request for the debug mode
             importer.runCmd(requestParams, sw);
             rsp.add("mode", "debug");
-            rsp.add("documents", debugDocuments);
-            if (sw.debugLogger != null)
-              rsp.add("verbose-output", sw.debugLogger.output);
-            debugDocuments = null;
+            rsp.add("documents", requestParams.debugDocuments);
+            if (requestParams.debugVerboseOutput != null) {
+            	rsp.add("verbose-output", requestParams.debugVerboseOutput);
+            }
+            requestParams.debugDocuments = new ArrayList<SolrInputDocument>(0);
+            requestParams.debugVerboseOutput = null;
           } else {
             message = DataImporter.MSG.DEBUG_NOT_ENABLED;
           }
@@ -215,7 +215,7 @@ public class DataImportHandler extends R
           if(requestParams.contentStream == null && !requestParams.syncMode){
             importer.runAsync(requestParams, sw);
           } else {
-              importer.runCmd(requestParams, sw);
+            importer.runCmd(requestParams, sw);
           }
         }
       } else if (DataImporter.RELOAD_CONF_CMD.equals(command)) {
@@ -280,16 +280,11 @@ public class DataImportHandler extends R
   private SolrWriter getSolrWriter(final UpdateRequestProcessor processor,
                                    final SolrResourceLoader loader, final DataImporter.RequestParams requestParams, SolrQueryRequest req) {
 
-    return new SolrWriter(processor, loader.getConfigDir(), myName, req) {
+    return new SolrWriter(processor, req) {
 
       @Override
       public boolean upload(SolrInputDocument document) {
         try {
-          if (requestParams.debug) {
-            if (debugDocuments == null)
-              debugDocuments = new ArrayList<SolrInputDocument>();
-            debugDocuments.add(document);
-          }
           return super.upload(document);
         } catch (RuntimeException e) {
           LOG.error( "Exception while adding: " + document, e);

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java Tue Aug 23 14:06:58 2011
@@ -18,11 +18,13 @@
 package org.apache.solr.handler.dataimport;
 
 import org.apache.solr.common.SolrException;
+import org.apache.solr.common.SolrInputDocument;
 import org.apache.solr.core.SolrConfig;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.schema.IndexSchema;
 import org.apache.solr.schema.SchemaField;
 import org.apache.solr.common.util.ContentStream;
+import org.apache.solr.common.util.NamedList;
 import org.apache.solr.common.util.StrUtils;
 import org.apache.solr.common.util.SystemIdResolver;
 import org.apache.solr.common.util.XMLErrorLogger;
@@ -39,7 +41,6 @@ import org.apache.commons.io.IOUtils;
 
 import javax.xml.parsers.DocumentBuilder;
 import javax.xml.parsers.DocumentBuilderFactory;
-import java.io.File;
 import java.io.StringReader;
 import java.text.SimpleDateFormat;
 import java.util.*;
@@ -80,26 +81,35 @@ public class DataImporter {
   public DocBuilder.Statistics cumulativeStatistics = new DocBuilder.Statistics();
 
   private SolrCore core;
+  
+  private DIHPropertiesWriter propWriter;
 
   private ReentrantLock importLock = new ReentrantLock();
 
   private final Map<String , Object> coreScopeSession;
 
   private boolean isDeltaImportSupported = false;
+  private final String handlerName;
 
   /**
    * Only for testing purposes
    */
   DataImporter() {
     coreScopeSession = new ConcurrentHashMap<String, Object>();
+    this.propWriter = new SimplePropertiesWriter();
+    propWriter.init(this);
+    this.handlerName = "dataimport" ;
   }
 
-  DataImporter(InputSource dataConfig, SolrCore core, Map<String, Properties> ds, Map<String, Object> session) {
+  DataImporter(InputSource dataConfig, SolrCore core, Map<String, Properties> ds, Map<String, Object> session, String handlerName) {
+      this.handlerName = handlerName;
     if (dataConfig == null)
       throw new DataImportHandlerException(SEVERE,
               "Configuration not found");
     this.core = core;
     this.schema = core.getSchema();
+    this.propWriter = new SimplePropertiesWriter();
+    propWriter.init(this);
     dataSourceProps = ds;
     if (session == null)
       session = new HashMap<String, Object>();
@@ -120,7 +130,11 @@ public class DataImporter {
     }
   }
 
-  private void verifyWithSchema(Map<String, DataConfig.Field> fields) {
+   public String getHandlerName() {
+        return handlerName;
+    }
+
+    private void verifyWithSchema(Map<String, DataConfig.Field> fields) {
     Map<String, SchemaField> schemaFields = schema.getFields();
     for (Map.Entry<String, SchemaField> entry : schemaFields.entrySet()) {
       SchemaField sf = entry.getValue();
@@ -353,7 +367,7 @@ public class DataImporter {
     setIndexStartTime(new Date());
 
     try {
-      docBuilder = new DocBuilder(this, writer, requestParams);
+      docBuilder = new DocBuilder(this, writer, propWriter, requestParams);
       checkWritablePersistFile(writer);
       docBuilder.execute();
       if (!requestParams.debug)
@@ -370,11 +384,11 @@ public class DataImporter {
   }
 
   private void checkWritablePersistFile(SolrWriter writer) {
-    File persistFile = writer.getPersistFile();
-    boolean isWritable = persistFile.exists() ? persistFile.canWrite() : persistFile.getParentFile().canWrite();
-    if (isDeltaImportSupported && !isWritable) {
-      throw new DataImportHandlerException(SEVERE, persistFile.getAbsolutePath() +
-          " is not writable. Delta imports are supported by data config but will not work.");
+//  	File persistFile = propWriter.getPersistFile();
+//    boolean isWritable = persistFile.exists() ? persistFile.canWrite() : persistFile.getParentFile().canWrite();
+    if (isDeltaImportSupported && !propWriter.isWritable()) {
+      throw new DataImportHandlerException(SEVERE,
+          "Properties is not writable. Delta imports are supported by data config but will not work.");
     }
   }
 
@@ -384,7 +398,7 @@ public class DataImporter {
 
     try {
       setIndexStartTime(new Date());
-      docBuilder = new DocBuilder(this, writer, requestParams);
+      docBuilder = new DocBuilder(this, writer, propWriter, requestParams);
       checkWritablePersistFile(writer);
       docBuilder.execute();
       if (!requestParams.debug)
@@ -503,7 +517,7 @@ public class DataImporter {
     public String command = null;
 
     public boolean debug = false;
-
+    
     public boolean verbose = false;
 
     public boolean syncMode = false;
@@ -525,6 +539,10 @@ public class DataImporter {
     public String dataConfig;
 
     public ContentStream contentStream;
+    
+    public List<SolrInputDocument> debugDocuments = Collections.synchronizedList(new ArrayList<SolrInputDocument>());
+    
+    public NamedList debugVerboseOutput = null;
 
     public RequestParams() {
     }

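The hunks above move the last-index-time bookkeeping off SolrWriter and onto a separate DIHPropertiesWriter, created as a SimplePropertiesWriter in both DataImporter constructors. Only four calls on it appear in this patch: init(DataImporter), isWritable(), readIndexerProperties() and persist(Properties). A minimal file-backed implementation consistent with those calls might look like the sketch below; everything other than those four method signatures (the class name, the properties file location, the use of getConfigDir()) is illustrative rather than part of this commit.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

// Sketch only: a file-backed DIHPropertiesWriter inferred from the calls made
// in this patch. Class name and file location are hypothetical.
public class FileBackedPropertiesWriter implements DIHPropertiesWriter {

  private File persistFile;

  public void init(DataImporter dataImporter) {
    if (dataImporter.getCore() == null) {
      // test-only DataImporter() constructor has no core; fall back to CWD
      persistFile = new File("dataimport.properties");
      return;
    }
    // Assumed location; the real SimplePropertiesWriter may choose differently.
    String dir = dataImporter.getCore().getResourceLoader().getConfigDir();
    persistFile = new File(dir, "dataimport.properties");
  }

  public boolean isWritable() {
    return persistFile.exists() ? persistFile.canWrite()
                                : persistFile.getParentFile().canWrite();
  }

  public Properties readIndexerProperties() {
    Properties props = new Properties();
    FileInputStream in = null;
    try {
      in = new FileInputStream(persistFile);
      props.load(in);
    } catch (IOException e) {
      // no properties yet -- first import
    } finally {
      if (in != null) try { in.close(); } catch (IOException ignored) {}
    }
    return props;
  }

  public void persist(Properties props) {
    FileOutputStream out = null;
    try {
      out = new FileOutputStream(persistFile);
      props.store(out, "last index times");
    } catch (IOException e) {
      throw new RuntimeException("Could not write " + persistFile, e);
    } finally {
      if (out != null) try { out.close(); } catch (IOException ignored) {}
    }
  }
}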
Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DebugLogger.java Tue Aug 23 14:06:58 2011
@@ -45,7 +45,7 @@ class DebugLogger {
   private Stack<DebugInfo> debugStack;
 
   NamedList output;
-  private final SolrWriter writer;
+//  private final SolrWriter writer1;
 
   private static final String LINE = "---------------------------------------------";
 
@@ -54,8 +54,8 @@ class DebugLogger {
 
   boolean enabled = true;
 
-  public DebugLogger(SolrWriter solrWriter) {
-    writer = solrWriter;
+  public DebugLogger() {
+//    writer = solrWriter;
     output = new NamedList();
     debugStack = new Stack<DebugInfo>() {
 
@@ -67,7 +67,7 @@ class DebugLogger {
         return super.pop();
       }
     };
-    debugStack.push(new DebugInfo(null, -1, null));
+    debugStack.push(new DebugInfo(null, DIHLogLevels.NONE, null));
     output = debugStack.peek().lst;
   }
 
@@ -75,47 +75,47 @@ class DebugLogger {
     return debugStack.isEmpty() ? null : debugStack.peek();
   }
 
-  public void log(int event, String name, Object row) {
-    if (event == SolrWriter.DISABLE_LOGGING) {
+  public void log(DIHLogLevels event, String name, Object row) {
+    if (event == DIHLogLevels.DISABLE_LOGGING) {
       enabled = false;
       return;
-    } else if (event == SolrWriter.ENABLE_LOGGING) {
+    } else if (event == DIHLogLevels.ENABLE_LOGGING) {
       enabled = true;
       return;
     }
 
-    if (!enabled && event != SolrWriter.START_ENTITY
-            && event != SolrWriter.END_ENTITY) {
+    if (!enabled && event != DIHLogLevels.START_ENTITY
+            && event != DIHLogLevels.END_ENTITY) {
       return;
     }
 
-    if (event == SolrWriter.START_DOC) {
-      debugStack.push(new DebugInfo(null, SolrWriter.START_DOC, peekStack()));
-    } else if (SolrWriter.START_ENTITY == event) {
+    if (event == DIHLogLevels.START_DOC) {
+      debugStack.push(new DebugInfo(null, DIHLogLevels.START_DOC, peekStack()));
+    } else if (DIHLogLevels.START_ENTITY == event) {
       debugStack
-              .push(new DebugInfo(name, SolrWriter.START_ENTITY, peekStack()));
-    } else if (SolrWriter.ENTITY_OUT == event
-            || SolrWriter.PRE_TRANSFORMER_ROW == event) {
-      if (debugStack.peek().type == SolrWriter.START_ENTITY
-              || debugStack.peek().type == SolrWriter.START_DOC) {
+              .push(new DebugInfo(name, DIHLogLevels.START_ENTITY, peekStack()));
+    } else if (DIHLogLevels.ENTITY_OUT == event
+            || DIHLogLevels.PRE_TRANSFORMER_ROW == event) {
+      if (debugStack.peek().type == DIHLogLevels.START_ENTITY
+              || debugStack.peek().type == DIHLogLevels.START_DOC) {
         debugStack.peek().lst.add(null, fmt.format(new Object[]{++debugStack
                 .peek().rowCount}));
         addToNamedList(debugStack.peek().lst, row);
         debugStack.peek().lst.add(null, LINE);
       }
-    } else if (event == SolrWriter.ROW_END) {
+    } else if (event == DIHLogLevels.ROW_END) {
       popAllTransformers();
-    } else if (SolrWriter.END_ENTITY == event) {
-      while (debugStack.pop().type != SolrWriter.START_ENTITY)
+    } else if (DIHLogLevels.END_ENTITY == event) {
+      while (debugStack.pop().type != DIHLogLevels.START_ENTITY)
         ;
-    } else if (SolrWriter.END_DOC == event) {
-      while (debugStack.pop().type != SolrWriter.START_DOC)
+    } else if (DIHLogLevels.END_DOC == event) {
+      while (debugStack.pop().type != DIHLogLevels.START_DOC)
         ;
-    } else if (event == SolrWriter.TRANSFORMER_EXCEPTION) {
+    } else if (event == DIHLogLevels.TRANSFORMER_EXCEPTION) {
       debugStack.push(new DebugInfo(name, event, peekStack()));
       debugStack.peek().lst.add("EXCEPTION",
               getStacktraceString((Exception) row));
-    } else if (SolrWriter.TRANSFORMED_ROW == event) {
+    } else if (DIHLogLevels.TRANSFORMED_ROW == event) {
       debugStack.push(new DebugInfo(name, event, peekStack()));
       debugStack.peek().lst.add(null, LINE);
       addToNamedList(debugStack.peek().lst, row);
@@ -124,10 +124,10 @@ class DebugLogger {
         DataImportHandlerException dataImportHandlerException = (DataImportHandlerException) row;
         dataImportHandlerException.debugged = true;
       }
-    } else if (SolrWriter.ENTITY_META == event) {
+    } else if (DIHLogLevels.ENTITY_META == event) {
       popAllTransformers();
       debugStack.peek().lst.add(name, row);
-    } else if (SolrWriter.ENTITY_EXCEPTION == event) {
+    } else if (DIHLogLevels.ENTITY_EXCEPTION == event) {
       if (row instanceof DataImportHandlerException) {
         DataImportHandlerException dihe = (DataImportHandlerException) row;
         if (dihe.debugged)
@@ -143,8 +143,8 @@ class DebugLogger {
 
   private void popAllTransformers() {
     while (true) {
-      int type = debugStack.peek().type;
-      if (type == SolrWriter.START_DOC || type == SolrWriter.START_ENTITY)
+    	DIHLogLevels type = debugStack.peek().type;
+      if (type == DIHLogLevels.START_DOC || type == DIHLogLevels.START_ENTITY)
         break;
       debugStack.pop();
     }
@@ -181,23 +181,23 @@ class DebugLogger {
 
       @Override
       public Object getData(String query) {
-        writer.log(SolrWriter.ENTITY_META, "query", query);
+        log(DIHLogLevels.ENTITY_META, "query", query);
         long start = System.currentTimeMillis();
         try {
           return ds.getData(query);
         } catch (DataImportHandlerException de) {
-          writer.log(SolrWriter.ENTITY_EXCEPTION,
+          log(DIHLogLevels.ENTITY_EXCEPTION,
                   null, de);
           throw de;
         } catch (Exception e) {
-          writer.log(SolrWriter.ENTITY_EXCEPTION,
+          log(DIHLogLevels.ENTITY_EXCEPTION,
                   null, e);
           DataImportHandlerException de = new DataImportHandlerException(
                   DataImportHandlerException.SEVERE, "", e);
           de.debugged = true;
           throw de;
         } finally {
-          writer.log(SolrWriter.ENTITY_META, "time-taken", DocBuilder
+          log(DIHLogLevels.ENTITY_META, "time-taken", DocBuilder
                   .getTimeElapsedSince(start));
         }
       }
@@ -208,18 +208,18 @@ class DebugLogger {
     return new Transformer() {
       @Override
       public Object transformRow(Map<String, Object> row, Context context) {
-        writer.log(SolrWriter.PRE_TRANSFORMER_ROW, null, row);
+        log(DIHLogLevels.PRE_TRANSFORMER_ROW, null, row);
         String tName = getTransformerName(t);
         Object result = null;
         try {
           result = t.transformRow(row, context);
-          writer.log(SolrWriter.TRANSFORMED_ROW, tName, result);
+          log(DIHLogLevels.TRANSFORMED_ROW, tName, result);
         } catch (DataImportHandlerException de) {
-          writer.log(SolrWriter.TRANSFORMER_EXCEPTION, tName, de);
+          log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, de);
           de.debugged = true;
           throw de;
         } catch (Exception e) {
-          writer.log(SolrWriter.TRANSFORMER_EXCEPTION, tName, e);
+          log(DIHLogLevels.TRANSFORMER_EXCEPTION, tName, e);
           DataImportHandlerException de = new DataImportHandlerException(DataImportHandlerException.SEVERE, "", e);
           de.debugged = true;
           throw de;
@@ -258,23 +258,23 @@ class DebugLogger {
 
     NamedList lst;
 
-    int type;
+    DIHLogLevels type;
 
     DebugInfo parent;
 
-    public DebugInfo(String name, int type, DebugInfo parent) {
+    public DebugInfo(String name, DIHLogLevels type, DebugInfo parent) {
       this.name = name;
       this.type = type;
       this.parent = parent;
       lst = new NamedList();
       if (parent != null) {
         String displayName = null;
-        if (type == SolrWriter.START_ENTITY) {
+        if (type == DIHLogLevels.START_ENTITY) {
           displayName = "entity:" + name;
-        } else if (type == SolrWriter.TRANSFORMED_ROW
-                || type == SolrWriter.TRANSFORMER_EXCEPTION) {
+        } else if (type == DIHLogLevels.TRANSFORMED_ROW
+                || type == DIHLogLevels.TRANSFORMER_EXCEPTION) {
           displayName = "transformer:" + name;
-        } else if (type == SolrWriter.START_DOC) {
+        } else if (type == DIHLogLevels.START_DOC) {
           this.name = displayName = "document#" + SolrWriter.getDocCount();
         }
         parent.lst.add(displayName, lst);

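DebugLogger now tags every event with the DIHLogLevels enum instead of the old int constants held on SolrWriter, and it no longer needs a SolrWriter at all since it logs into its own NamedList. The enum's definition is not part of this excerpt; collecting the values referenced in DebugLogger and DocBuilder, it presumably looks roughly like the following (declaration order and any additional members are guesses):

// Approximate shape of DIHLogLevels, reconstructed from the values used in
// DebugLogger and DocBuilder in this patch; ordering is a guess.
public enum DIHLogLevels {
  START_ENTITY, END_ENTITY,
  START_DOC, END_DOC,
  ENTITY_OUT, ENTITY_META, ENTITY_EXCEPTION,
  PRE_TRANSFORMER_ROW, TRANSFORMED_ROW, TRANSFORMER_EXCEPTION,
  ROW_END,
  DISABLE_LOGGING, ENABLE_LOGGING,
  NONE
}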
Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DocBuilder.java Tue Aug 23 14:06:58 2011
@@ -56,33 +56,69 @@ public class DocBuilder {
 
   public Statistics importStatistics = new Statistics();
 
-  SolrWriter writer;
+  DIHWriter writer;
 
   DataImporter.RequestParams requestParameters;
 
   boolean verboseDebug = false;
 
-   Map<String, Object> session = new ConcurrentHashMap<String, Object>();
+  Map<String, Object> session = new ConcurrentHashMap<String, Object>();
 
   static final ThreadLocal<DocBuilder> INSTANCE = new ThreadLocal<DocBuilder>();
   Map<String, Object> functionsNamespace;
   private Properties persistedProperties;
-
-  public DocBuilder(DataImporter dataImporter, SolrWriter writer, DataImporter.RequestParams reqParams) {
+  
+  private DIHPropertiesWriter propWriter;
+  private static final String PARAM_WRITER_IMPL = "writerImpl";
+  private static final String DEFAULT_WRITER_NAME = "SolrWriter";
+  private DebugLogger debugLogger;
+  private DataImporter.RequestParams reqParams;
+  
+    @SuppressWarnings("unchecked")
+  public DocBuilder(DataImporter dataImporter, SolrWriter solrWriter, DIHPropertiesWriter propWriter, DataImporter.RequestParams reqParams) {
     INSTANCE.set(this);
     this.dataImporter = dataImporter;
-    this.writer = writer;
+    this.reqParams = reqParams;
+    this.propWriter = propWriter;
     DataImporter.QUERY_COUNT.set(importStatistics.queryCount);
     requestParameters = reqParams;
     verboseDebug = requestParameters.debug && requestParameters.verbose;
     functionsNamespace = EvaluatorBag.getFunctionsNamespace(this.dataImporter.getConfig().functions, this);
-    persistedProperties = writer.readIndexerProperties();
+    persistedProperties = propWriter.readIndexerProperties();
+    
+    String writerClassStr = null;
+    if(reqParams!=null && reqParams.requestParams != null) {
+    	writerClassStr = (String) reqParams.requestParams.get(PARAM_WRITER_IMPL);
+    }
+    if(writerClassStr != null && !writerClassStr.equals(DEFAULT_WRITER_NAME) && !writerClassStr.equals(DocBuilder.class.getPackage().getName() + "." + DEFAULT_WRITER_NAME)) {
+    	try {
+    		Class<DIHWriter> writerClass = loadClass(writerClassStr, dataImporter.getCore());
+    		this.writer = writerClass.newInstance();
+    	} catch (Exception e) {
+    		throw new DataImportHandlerException(DataImportHandlerException.SEVERE, "Unable to load Writer implementation:" + writerClassStr, e);
+    	}
+   	} else {
+    	writer = solrWriter;
+    }
+    ContextImpl ctx = new ContextImpl(null, null, null, null, reqParams.requestParams, null, this);
+    writer.init(ctx);
+  }
+
+
+
+
+  DebugLogger getDebugLogger(){
+    if (debugLogger == null) {
+      debugLogger = new DebugLogger();
+    }
+    return debugLogger;
   }
 
   public VariableResolverImpl getVariableResolver() {
     try {
       VariableResolverImpl resolver = null;
-      if(dataImporter != null && dataImporter.getCore() != null){
+      if(dataImporter != null && dataImporter.getCore() != null
+          && dataImporter.getCore().getResourceLoader().getCoreProperties() != null){
         resolver =  new VariableResolverImpl(dataImporter.getCore().getResourceLoader().getCoreProperties());
       } else resolver = new VariableResolverImpl();
       Map<String, Object> indexerNamespace = new HashMap<String, Object>();
@@ -135,94 +171,103 @@ public class DocBuilder {
 
   @SuppressWarnings("unchecked")
   public void execute() {
-    dataImporter.store(DataImporter.STATUS_MSGS, statusMessages);
-    document = dataImporter.getConfig().document;
-    final AtomicLong startTime = new AtomicLong(System.currentTimeMillis());
-    statusMessages.put(TIME_ELAPSED, new Object() {
-      @Override
-      public String toString() {
-        return getTimeElapsedSince(startTime.get());
-      }
-    });
-
-    statusMessages.put(DataImporter.MSG.TOTAL_QUERIES_EXECUTED,
-            importStatistics.queryCount);
-    statusMessages.put(DataImporter.MSG.TOTAL_ROWS_EXECUTED,
-            importStatistics.rowsCount);
-    statusMessages.put(DataImporter.MSG.TOTAL_DOC_PROCESSED,
-            importStatistics.docCount);
-    statusMessages.put(DataImporter.MSG.TOTAL_DOCS_SKIPPED,
-            importStatistics.skipDocCount);
-
-    List<String> entities = requestParameters.entities;
-
-    // Trigger onImportStart
-    if (document.onImportStart != null) {
-      invokeEventListener(document.onImportStart);
-    }
-    AtomicBoolean fullCleanDone = new AtomicBoolean(false);
-    //we must not do a delete of *:* multiple times if there are multiple root entities to be run
-    Properties lastIndexTimeProps = new Properties();
-    lastIndexTimeProps.setProperty(LAST_INDEX_KEY,
-            DataImporter.DATE_TIME_FORMAT.get().format(dataImporter.getIndexStartTime()));
-    for (DataConfig.Entity e : document.entities) {
-      if (entities != null && !entities.contains(e.name))
-        continue;
-      lastIndexTimeProps.setProperty(e.name + "." + LAST_INDEX_KEY,
-              DataImporter.DATE_TIME_FORMAT.get().format(new Date()));
-      root = e;
-      String delQuery = e.allAttributes.get("preImportDeleteQuery");
-      if (dataImporter.getStatus() == DataImporter.Status.RUNNING_DELTA_DUMP) {
-        cleanByQuery(delQuery, fullCleanDone);
-        doDelta();
-        delQuery = e.allAttributes.get("postImportDeleteQuery");
-        if (delQuery != null) {
-          fullCleanDone.set(false);
-          cleanByQuery(delQuery, fullCleanDone);
-        }
-      } else {
-        cleanByQuery(delQuery, fullCleanDone);
-        doFullDump();
-        delQuery = e.allAttributes.get("postImportDeleteQuery");
-        if (delQuery != null) {
-          fullCleanDone.set(false);
-          cleanByQuery(delQuery, fullCleanDone);
-        }
-      }
-      statusMessages.remove(DataImporter.MSG.TOTAL_DOC_PROCESSED);
-    }
-
-    if (stop.get()) {
-      // Dont commit if aborted using command=abort
-      statusMessages.put("Aborted", DataImporter.DATE_TIME_FORMAT.get().format(new Date()));
-      rollback();
-    } else {
-      // Do not commit unnecessarily if this is a delta-import and no documents were created or deleted
-      if (!requestParameters.clean) {
-        if (importStatistics.docCount.get() > 0 || importStatistics.deletedDocCount.get() > 0) {
-          finish(lastIndexTimeProps);
-        }
-      } else {
-        // Finished operation normally, commit now
-        finish(lastIndexTimeProps);
-      }
-      
-      if (writer != null) {
-        writer.finish();
-      }
-      
-      if (document.onImportEnd != null) {
-        invokeEventListener(document.onImportEnd);
-      }
-    }
-
-    statusMessages.remove(TIME_ELAPSED);
-    statusMessages.put(DataImporter.MSG.TOTAL_DOC_PROCESSED, ""+ importStatistics.docCount.get());
-    if(importStatistics.failedDocCount.get() > 0)
-      statusMessages.put(DataImporter.MSG.TOTAL_FAILED_DOCS, ""+ importStatistics.failedDocCount.get());
-
-    statusMessages.put("Time taken ", getTimeElapsedSince(startTime.get()));
-    LOG.info("Time taken = " + getTimeElapsedSince(startTime.get()));
+  	try {
+	    dataImporter.store(DataImporter.STATUS_MSGS, statusMessages);
+	    document = dataImporter.getConfig().document;
+	    final AtomicLong startTime = new AtomicLong(System.currentTimeMillis());
+	    statusMessages.put(TIME_ELAPSED, new Object() {
+	      @Override
+	      public String toString() {
+	        return getTimeElapsedSince(startTime.get());
+	      }
+	    });
+	
+	    statusMessages.put(DataImporter.MSG.TOTAL_QUERIES_EXECUTED,
+	            importStatistics.queryCount);
+	    statusMessages.put(DataImporter.MSG.TOTAL_ROWS_EXECUTED,
+	            importStatistics.rowsCount);
+	    statusMessages.put(DataImporter.MSG.TOTAL_DOC_PROCESSED,
+	            importStatistics.docCount);
+	    statusMessages.put(DataImporter.MSG.TOTAL_DOCS_SKIPPED,
+	            importStatistics.skipDocCount);
+	
+	    List<String> entities = requestParameters.entities;
+	
+	    // Trigger onImportStart
+	    if (document.onImportStart != null) {
+	      invokeEventListener(document.onImportStart);
+	    }
+	    AtomicBoolean fullCleanDone = new AtomicBoolean(false);
+	    //we must not do a delete of *:* multiple times if there are multiple root entities to be run
+	    Properties lastIndexTimeProps = new Properties();
+	    lastIndexTimeProps.setProperty(LAST_INDEX_KEY,
+	            DataImporter.DATE_TIME_FORMAT.get().format(dataImporter.getIndexStartTime()));
+	    for (DataConfig.Entity e : document.entities) {
+	      if (entities != null && !entities.contains(e.name))
+	        continue;
+	      lastIndexTimeProps.setProperty(e.name + "." + LAST_INDEX_KEY,
+	              DataImporter.DATE_TIME_FORMAT.get().format(new Date()));
+	      root = e;
+	      String delQuery = e.allAttributes.get("preImportDeleteQuery");
+	      if (dataImporter.getStatus() == DataImporter.Status.RUNNING_DELTA_DUMP) {
+	        cleanByQuery(delQuery, fullCleanDone);
+	        doDelta();
+	        delQuery = e.allAttributes.get("postImportDeleteQuery");
+	        if (delQuery != null) {
+	          fullCleanDone.set(false);
+	          cleanByQuery(delQuery, fullCleanDone);
+	        }
+	      } else {
+	        cleanByQuery(delQuery, fullCleanDone);
+	        doFullDump();
+	        delQuery = e.allAttributes.get("postImportDeleteQuery");
+	        if (delQuery != null) {
+	          fullCleanDone.set(false);
+	          cleanByQuery(delQuery, fullCleanDone);
+	        }
+	      }
+	      statusMessages.remove(DataImporter.MSG.TOTAL_DOC_PROCESSED);
+	    }
+	
+	    if (stop.get()) {
+	      // Dont commit if aborted using command=abort
+	      statusMessages.put("Aborted", DataImporter.DATE_TIME_FORMAT.get().format(new Date()));
+	      rollback();
+	    } else {
+	      // Do not commit unnecessarily if this is a delta-import and no documents were created or deleted
+	      if (!requestParameters.clean) {
+	        if (importStatistics.docCount.get() > 0 || importStatistics.deletedDocCount.get() > 0) {
+	          finish(lastIndexTimeProps);
+	        }
+	      } else {
+	        // Finished operation normally, commit now
+	        finish(lastIndexTimeProps);
+	      } 
+	      
+	      if (document.onImportEnd != null) {
+	        invokeEventListener(document.onImportEnd);
+	      }
+	    }
+	
+	    statusMessages.remove(TIME_ELAPSED);
+	    statusMessages.put(DataImporter.MSG.TOTAL_DOC_PROCESSED, ""+ importStatistics.docCount.get());
+	    if(importStatistics.failedDocCount.get() > 0)
+	      statusMessages.put(DataImporter.MSG.TOTAL_FAILED_DOCS, ""+ importStatistics.failedDocCount.get());
+	
+	    statusMessages.put("Time taken ", getTimeElapsedSince(startTime.get()));
+	    LOG.info("Time taken = " + getTimeElapsedSince(startTime.get()));
+	  } catch(Exception e)
+		{
+			throw new RuntimeException(e);
+		} finally
+		{
+			if (writer != null) {
+	      writer.close();
+	    }
+			if(requestParameters.debug) {
+				requestParameters.debugVerboseOutput = getDebugLogger().output;	
+			}
+		}
   }
 
   @SuppressWarnings("unchecked")
@@ -238,7 +283,7 @@ public class DocBuilder {
         addStatusMessage("Optimized");
     }
     try {
-      writer.persist(lastIndexTimeProps);
+      propWriter.persist(lastIndexTimeProps);
     } catch (Exception e) {
       LOG.error("Could not write property file", e);
       statusMessages.put("error", "Could not write property file. Delta imports will not work. " +
@@ -433,11 +478,11 @@ public class DocBuilder {
     private void runAThread(ThreadedEntityProcessorWrapper epw, EntityRow rows, String currProcess) throws Exception {
       currentEntityProcWrapper.set(epw);
       epw.threadedInit(context);
-      initEntity();
       try {
+        Context.CURRENT_CONTEXT.set(context);
         epw.init(rows);
+        initEntity();
         DocWrapper docWrapper = this.docWrapper;
-        Context.CURRENT_CONTEXT.set(context);
         for (; ;) {
           if(DocBuilder.this.stop.get()) break;
           try {
@@ -474,6 +519,9 @@ public class DocBuilder {
                   LOG.debug("adding a doc "+docWrapper);
                 }
                 boolean result = writer.upload(docWrapper);
+                if(reqParams.debug) {
+                	reqParams.debugDocuments.add(docWrapper);
+                }
                 docWrapper = null;
                 if (result){
                   importStatistics.docCount.incrementAndGet();
@@ -556,11 +604,11 @@ public class DocBuilder {
     Context.CURRENT_CONTEXT.set(ctx);
     
     if (requestParameters.start > 0) {
-      writer.log(SolrWriter.DISABLE_LOGGING, null, null);
+      getDebugLogger().log(DIHLogLevels.DISABLE_LOGGING, null, null);
     }
 
     if (verboseDebug) {
-      writer.log(SolrWriter.START_ENTITY, entity.name, null);
+      getDebugLogger().log(DIHLogLevels.START_ENTITY, entity.name, null);
     }
 
     int seenDocCount = 0;
@@ -574,11 +622,11 @@ public class DocBuilder {
           seenDocCount++;
 
           if (seenDocCount > requestParameters.start) {
-            writer.log(SolrWriter.ENABLE_LOGGING, null, null);
+            getDebugLogger().log(DIHLogLevels.ENABLE_LOGGING, null, null);
           }
 
           if (verboseDebug && entity.isDocRoot) {
-            writer.log(SolrWriter.START_DOC, entity.name, null);
+            getDebugLogger().log(DIHLogLevels.START_DOC, entity.name, null);
           }
           if (doc == null && entity.isDocRoot) {
             doc = new DocWrapper();
@@ -607,7 +655,7 @@ public class DocBuilder {
           }
 
           if (verboseDebug) {
-            writer.log(SolrWriter.ENTITY_OUT, entity.name, arow);
+            getDebugLogger().log(DIHLogLevels.ENTITY_OUT, entity.name, arow);
           }
           importStatistics.rowsCount.incrementAndGet();
           if (doc != null) {
@@ -632,6 +680,9 @@ public class DocBuilder {
               return;
             if (!doc.isEmpty()) {
               boolean result = writer.upload(doc);
+              if(reqParams.debug) {
+              	reqParams.debugDocuments.add(doc);
+              }
               doc = null;
               if (result){
                 importStatistics.docCount.incrementAndGet();
@@ -643,7 +694,7 @@ public class DocBuilder {
 
         } catch (DataImportHandlerException e) {
           if (verboseDebug) {
-            writer.log(SolrWriter.ENTITY_EXCEPTION, entity.name, e);
+            getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, entity.name, e);
           }
           if(e.getErrCode() == DataImportHandlerException.SKIP_ROW){
             continue;
@@ -662,21 +713,21 @@ public class DocBuilder {
             throw e;
         } catch (Throwable t) {
           if (verboseDebug) {
-            writer.log(SolrWriter.ENTITY_EXCEPTION, entity.name, t);
+            getDebugLogger().log(DIHLogLevels.ENTITY_EXCEPTION, entity.name, t);
           }
           throw new DataImportHandlerException(DataImportHandlerException.SEVERE, t);
         } finally {
           if (verboseDebug) {
-            writer.log(SolrWriter.ROW_END, entity.name, null);
+            getDebugLogger().log(DIHLogLevels.ROW_END, entity.name, null);
             if (entity.isDocRoot)
-              writer.log(SolrWriter.END_DOC, null, null);
+              getDebugLogger().log(DIHLogLevels.END_DOC, null, null);
             Context.CURRENT_CONTEXT.remove();
           }
         }
       }
     } finally {
       if (verboseDebug) {
-        writer.log(SolrWriter.END_ENTITY, null, null);
+        getDebugLogger().log(DIHLogLevels.END_ENTITY, null, null);
       }
       entityProcessor.destroy();
     }

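The DocBuilder changes make the destination writer pluggable: when a request supplies a writerImpl parameter whose value is neither "SolrWriter" nor its fully-qualified name, DocBuilder loads that class and uses it instead of the SolrWriter it was handed. The calls DocBuilder makes on the writer in this patch are init(Context), upload(...) and close(). A skeleton of a custom writer selected that way could look like this; the class name and request URL are made up, and any DIHWriter methods not exercised above are omitted from the sketch.

// Illustrative only: a do-nothing writer chosen via the new writerImpl
// request parameter, e.g.
//   /dataimport?command=full-import&writerImpl=com.example.NoOpDIHWriter
// Only the methods DocBuilder is seen calling in this patch are shown;
// the real DIHWriter interface may declare more, which are omitted here.
package com.example;

import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.handler.dataimport.Context;
import org.apache.solr.handler.dataimport.DIHWriter;

public class NoOpDIHWriter implements DIHWriter {

  public void init(Context context) {
    // writer-specific request parameters could be read from the context here
  }

  public boolean upload(SolrInputDocument doc) {
    return true; // pretend every document was written
  }

  public void close() {
    // a real destination would flush or commit here
  }
}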
Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EntityProcessorWrapper.java Tue Aug 23 14:06:58 2011
@@ -83,7 +83,7 @@ public class EntityProcessorWrapper exte
       @Override
       public boolean add(Transformer transformer) {
         if (docBuilder != null && docBuilder.verboseDebug) {
-          transformer = docBuilder.writer.getDebugLogger().wrapTransformer(transformer);
+          transformer = docBuilder.getDebugLogger().wrapTransformer(transformer);
         }
         return super.add(transformer);
       }

Modified: lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java
URL: http://svn.apache.org/viewvc/lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java?rev=1160700&r1=1160699&r2=1160700&view=diff
==============================================================================
--- lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java (original)
+++ lucene/dev/branches/flexscoring/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/EvaluatorBag.java Tue Aug 23 14:06:58 2011
@@ -217,7 +217,6 @@ public class EvaluatorBag {
         Evaluator evaluator = evaluators.get(fname);
         if (evaluator == null)
           return null;
-        VariableResolverImpl vri = VariableResolverImpl.CURRENT_VARIABLE_RESOLVER.get();
         return evaluator.evaluate(m.group(2), Context.CURRENT_CONTEXT.get());
       }
 


