I modified storage-config.xml to add a new table and a couple of column families (see excerpt below).  The new table is named "NewTable" and has the column families "Standard3", "Super3", and "Super4".

    <!-- Tables and ColumnFamilies                                            -->
        <Table Name="Table1">
            <!-- if FlushPeriodInMinutes is configured and positive, it will be
                 flushed to disk with that period whether it is dirty or not.
                 This is intended for lightly-used columnfamilies so that they
                 do not prevent commitlog segments from being purged. -->
            <ColumnFamily ColumnSort="Name" Name="Standard1" FlushPeriodInMinutes="60"/>
            <ColumnFamily ColumnSort="Name" Name="Standard2"/>
            <ColumnFamily ColumnSort="Time" Name="StandardByTime1"/>
            <ColumnFamily ColumnSort="Time" Name="StandardByTime2"/>
            <ColumnFamily ColumnType="Super" ColumnSort="Name" Name="Super1"/>
            <ColumnFamily ColumnType="Super" ColumnSort="Name" Name="Super2"/>
        </Table>
        <Table Name="NewTable">
            <ColumnFamily ColumnSort="Name" Name="Standard3"/>
            <ColumnFamily ColumnType="Super" ColumnSort="Name" Name="Super3"/>
            <ColumnFamily ColumnType="Super" ColumnSort="Name" Name="Super4"/>
        </Table>


Below is the code that inserts the data; the goal is to feed Cassandra with data from an XML file.
When I execute the code, I get the exception shown next.  What I don't understand is why this code fails even though I have configured the super column families and the new table.

InvalidRequestException(why:Column Family Super3 is invalid.)
    at org.apache.cassandra.service.Cassandra$get_column_result.read(Cassandra.java:3604)
    at org.apache.cassandra.service.Cassandra$Client.recv_get_column(Cassandra.java:202)
    at org.apache.cassandra.service.Cassandra$Client.get_column(Cassandra.java:178)
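For reference, Sample.xml is structured along these lines — a minimal version with placeholder values that satisfies the two XPath expressions the code uses (/Document/docID and /Document/node1):

```xml
<Document>
    <docID>doc42</docID>
    <node1>value-a</node1>
    <node1>value-b</node1>
</Document>
```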

        // New Table sample
        String docID = "";
        try {
            batch_mutation_super_t bt = new batch_mutation_super_t();
            bt.table = "NewTable";
            bt.cfmap = new HashMap<String, List<superColumn_t>>();
            // Read the sample XML
            XMLUtils xmlUtils = new XMLUtils(
                /* path prefix elided */
                System.getProperty("file.separator")
                + "Sample.xml");
            /* docID from the XML file */
            docID = xmlUtils.getNodeValue("/Document/docID");
            bt.key = docID;
            // Collect all nodes that match /Document/node1
            NodeList nl = xmlUtils.getRequestedNodeList("/Document/node1");
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            /* nodes */
            // see the populate function below
            List<column_t> nodes_arr = populate("node", "/Document/node1", xmlUtils, t);
            List<superColumn_t> S3 = new ArrayList<superColumn_t>();
            S3.add(new superColumn_t("sc1", nodes_arr));
            bt.cfmap.put("Super3", S3);
            List<superColumn_t> S4 = new ArrayList<superColumn_t>();
            S4.add(new superColumn_t("sc1_replicate", nodes_arr));
            bt.cfmap.put("Super4", S4);
            peerstorageClient.batch_insert_superColumn(bt, false);
        } catch (Exception e) {
            e.printStackTrace();
        }
    // Returns columns of XML data matching xpath in the given XML doc (via xmlUtils)
    private static List<column_t> populate(String column_prefix, String xpath,
            XMLUtils xmlUtils, Transformer t) throws Exception {
        List<column_t> c = new ArrayList<column_t>();
        NodeList nl = xmlUtils.getRequestedNodeList(xpath);
        long now = Calendar.getInstance().getTimeInMillis();
        if (nl != null) {
            for (int i = 0; i < nl.getLength(); i++) {
                StringWriter sw = new StringWriter();
                t.transform(new DOMSource(nl.item(i)), new StreamResult(sw));
                c.add(new column_t(column_prefix + i, sw.toString().getBytes(), now));
            }
        }
        return c;
    }
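Independent of the Cassandra side, the XPath-and-transform plumbing can be sanity-checked with the JDK alone. This is a minimal sketch (the class name and sample data are made up) that mirrors what populate() does, without XMLUtils or the Thrift types:

```java
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class XPathSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in for Sample.xml (contents invented for the test)
        String xml = "<Document><docID>doc42</docID>"
                + "<node1>alpha</node1><node1>beta</node1></Document>";
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        XPath xp = XPathFactory.newInstance().newXPath();

        // Same two lookups the insert code performs via XMLUtils
        String docID = xp.evaluate("/Document/docID", doc);
        NodeList nl = (NodeList) xp.evaluate("/Document/node1", doc,
                XPathConstants.NODESET);

        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");

        // Mirrors populate(): one serialized value per matching node
        List<String> values = new ArrayList<String>();
        for (int i = 0; i < nl.getLength(); i++) {
            StringWriter sw = new StringWriter();
            t.transform(new DOMSource(nl.item(i)), new StreamResult(sw));
            values.add(sw.toString());
        }

        if (!"doc42".equals(docID)) throw new AssertionError(docID);
        if (values.size() != 2) throw new AssertionError(values);
        if (!values.get(0).contains("alpha")) throw new AssertionError(values);
        System.out.println("docID=" + docID + ", columns=" + values.size());
    }
}
```

If this runs cleanly, the serialization path is fine on its own, which points the finger at the server side (the node apparently does not know about Super3, e.g. because it was started before the config change or is reading a different config file).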

Thanks for checking this issue out.