incubator-jspwiki-dev mailing list archives

From "Janne Jalkanen (JIRA)" <j...@apache.org>
Subject [jira] Resolved: (JSPWIKI-527) Parsing of WikiDocument cuts too long lines/paragraphs
Date Tue, 28 Apr 2009 14:40:30 GMT

     [ https://issues.apache.org/jira/browse/JSPWIKI-527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Janne Jalkanen resolved JSPWIKI-527.
------------------------------------

    Resolution: Won't Fix

This is a feature, not a bug. Lines which are too long may cause memory management problems,
so we filter out any really large lines.

A simple workaround is to emit a newline at suitable points in your HTML.
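
Below is a minimal sketch of that workaround, assuming the page filter builds its HTML as a plain Java String. The class name, method name, and the 8000-character threshold are illustrative assumptions, not JSPWiki API; the only fixed point is the roughly 10000-character cut mentioned in the report.

/**
 * Illustrative helper (not part of JSPWiki): inserts newlines after
 * closing '>' characters so that no single line of generated HTML
 * approaches the ~10000-character cut described in this issue.
 */
public final class LongLineBreaker
{
    // Assumed safety margin below the reported ~10000-character limit.
    private static final int MAX_LINE_LENGTH = 8000;

    private LongLineBreaker() {}

    public static String insertNewlines( String html )
    {
        StringBuilder out = new StringBuilder( html.length() + 64 );
        int lineLength = 0;

        for( int i = 0; i < html.length(); i++ )
        {
            char c = html.charAt( i );
            out.append( c );

            if( c == '\n' )
            {
                lineLength = 0;
                continue;
            }

            lineLength++;

            // Break after a tag closes once the current line gets long,
            // so the inserted newline never lands inside a tag.
            if( c == '>' && lineLength > MAX_LINE_LENGTH )
            {
                out.append( '\n' );
                lineLength = 0;
            }
        }
        return out.toString();
    }
}

A filter would apply this to its generated markup before returning it, e.g. LongLineBreaker.insertNewlines( generatedHtml ). Breaking only after '>' keeps tags intact; whitespace-sensitive content such as <pre> blocks would need more care.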

> Parsing of WikiDocument cuts too long lines/paragraphs
> ------------------------------------------------------
>
>                 Key: JSPWIKI-527
>                 URL: https://issues.apache.org/jira/browse/JSPWIKI-527
>             Project: JSPWiki
>          Issue Type: Bug
>          Components: Core & storage
>    Affects Versions: 2.8.1
>         Environment: Win XP & Linux
>            Reporter: Jochen Reutelshoefer
>
> After the filters are run, the WikiDocument is re-parsed if the filter changed some data.
> If there is some very long content without line breaks, the paragraph gets cut (> 10000 characters).
> I have a PageFilter that creates (longer) HTML output. If I don't put in any line breaks,
> it always gets broken/trimmed to the same length (destroying the HTML structure of the page).
> It might have something to do with the algorithm parsing the tree of Content objects and/or
> the data structures used.
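
As a quick sanity check against the cut described above, filter output can be scanned for its longest newline-free run before it is returned; the 10000 figure below is taken from this report and is only an assumption about the exact threshold.

// Returns the length of the longest run of characters without a newline.
static int longestLineLength( String text )
{
    int longest = 0;
    int current = 0;

    for( int i = 0; i < text.length(); i++ )
    {
        if( text.charAt( i ) == '\n' )
        {
            current = 0;
        }
        else if( ++current > longest )
        {
            longest = current;
        }
    }
    return longest;
}

// Example: verify that generated HTML stays safely below the reported cut.
// assert longestLineLength( generatedHtml ) < 10000;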

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

