You'd be better off submitting this question to the solr-users mailing list - that's where the Solr experts hang out.

-Simon

On Thu, Sep 1, 2011 at 5:43 PM, angel wrote:
> Hi, below is my Java program for indexing around 30 million records from
> a CSV file. It doesn't work for such a large file, although it works
> perfectly for smaller files. What is wrong with my code? Please let me know.
>
>     try {
>         SolrServer server = new
>                 CommonsHttpSolrServer("http://localhost:8070/solr");
>
>         ContentStreamUpdateRequest req = new
>                 ContentStreamUpdateRequest("/update/csv");
>
>         req.addFile(new File("/samplefile_25millionrecords.csv"));
>         req.setParam("commit", "true");
>         req.setParam("stream.contentType", "text/plain;charset=utf-8");
>         req.setParam("separator", "|");
>         req.setParam("skipLines", "0");
>         // req.setParam("fieldnames", "ewyyewu,etuiewtu"); // 30 or more fields
>
>         NamedList<Object> result = server.request(req);
>         System.out.println("Result:\n" + result);
>     } catch (IOException e) {
>         e.printStackTrace();
>     } catch (SolrServerException e) {
>         e.printStackTrace();
>     }
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/indexing-30million-records-to-Solr-using-solrj-doen-t-work-but-works-for-small-files-tp3302667p3302667.html
> Sent from the Lucene - General mailing list archive at Nabble.com.
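
For reference, a minimal sketch of one way to sidestep the memory problem: instead of uploading the entire 30-million-row file through the client with addFile(), let Solr open the CSV from its own filesystem via the stream.file parameter (this requires enableRemoteStreaming="true" in solrconfig.xml). The Solr URL, file path, separator, and skipLines values below are taken from the quoted post; the class name is a placeholder, and this is an illustrative sketch rather than a tested fix.

    import java.io.IOException;

    import org.apache.solr.client.solrj.SolrServer;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
    import org.apache.solr.client.solrj.request.ContentStreamUpdateRequest;
    import org.apache.solr.common.util.NamedList;

    public class CsvStreamIndexer {            // placeholder class name
        public static void main(String[] args) {
            try {
                // Same SolrJ 3.x client and core URL as in the original post
                SolrServer server =
                        new CommonsHttpSolrServer("http://localhost:8070/solr");

                ContentStreamUpdateRequest req =
                        new ContentStreamUpdateRequest("/update/csv");

                // Point Solr at the file on its local disk instead of pushing
                // the whole file over HTTP from the client; this needs
                // enableRemoteStreaming="true" in solrconfig.xml.
                req.setParam("stream.file", "/samplefile_25millionrecords.csv");
                req.setParam("stream.contentType", "text/plain;charset=utf-8");
                req.setParam("separator", "|");
                req.setParam("skipLines", "0");
                req.setParam("commit", "true");

                NamedList<Object> result = server.request(req);
                System.out.println("Result:\n" + result);
            } catch (IOException e) {
                e.printStackTrace();
            } catch (SolrServerException e) {
                e.printStackTrace();
            }
        }
    }

If remote streaming cannot be enabled, the other usual route is to read the CSV in the client and send SolrInputDocument batches of a few thousand rows at a time with periodic commits, so the whole file never has to sit in memory at once.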