camel-commits mailing list archives

From acosent...@apache.org
Subject [1/2] camel git commit: Added camel-spark-rest docs to Gitbook
Date Tue, 07 Jun 2016 09:07:02 GMT
Repository: camel
Updated Branches:
  refs/heads/master 00f72a29d -> b227c607c


Added camel-spark-rest docs to Gitbook


Project: http://git-wip-us.apache.org/repos/asf/camel/repo
Commit: http://git-wip-us.apache.org/repos/asf/camel/commit/df5a8e02
Tree: http://git-wip-us.apache.org/repos/asf/camel/tree/df5a8e02
Diff: http://git-wip-us.apache.org/repos/asf/camel/diff/df5a8e02

Branch: refs/heads/master
Commit: df5a8e021939cc8f745d1688baa539c481950332
Parents: 00f72a2
Author: Andrea Cosentino <ancosen@gmail.com>
Authored: Tue Jun 7 10:53:19 2016 +0200
Committer: Andrea Cosentino <ancosen@gmail.com>
Committed: Tue Jun 7 10:53:19 2016 +0200

----------------------------------------------------------------------
 .../src/main/docs/spark-rest.adoc               | 193 +++++++++++++++++++
 docs/user-manual/en/SUMMARY.md                  |   1 +
 2 files changed, 194 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/camel/blob/df5a8e02/components/camel-spark-rest/src/main/docs/spark-rest.adoc
----------------------------------------------------------------------
diff --git a/components/camel-spark-rest/src/main/docs/spark-rest.adoc b/components/camel-spark-rest/src/main/docs/spark-rest.adoc
new file mode 100644
index 0000000..53d9125
--- /dev/null
+++ b/components/camel-spark-rest/src/main/docs/spark-rest.adoc
@@ -0,0 +1,193 @@
+[[Spark-rest-Spark-restComponent]]
+Spark-rest Component
+~~~~~~~~~~~~~~~~~~~~
+
+*Available as of Camel 2.14*
+
+The Spark-rest component allows you to define REST endpoints with the
+http://sparkjava.com/[Spark REST Java library] using the
+link:rest-dsl.html[Rest DSL].
+
+INFO: Spark Java requires a Java 8 runtime.
+
+Maven users will need to add the following dependency to their pom.xml
+for this component:
+
+[source,xml]
+-------------------------------------------------
+  <dependency>
+    <groupId>org.apache.camel</groupId>
+    <artifactId>camel-spark-rest</artifactId>
+    <version>${camel-version}</version>
+  </dependency>
+-------------------------------------------------
+
+[[Spark-rest-URIformat]]
+URI format
+^^^^^^^^^^
+
+[source,java]
+----------------------------------
+  spark-rest://verb:path?[options]
+----------------------------------
+
+[[Spark-rest-URIOptions]]
+URI Options
+^^^^^^^^^^^
+
+
+// component options: START
+The Spark Rest component supports 11 options which are listed below.
+
+
+
+{% raw %}
+[width="100%",cols="2s,1m,8",options="header"]
+|=======================================================================
+| Name | Java Type | Description
+| port | int | Port number. By default port 4567 is used.
+| ipAddress | String | Sets the IP address that Spark should listen on. If not set, the default address is '0.0.0.0'.
+| minThreads | int | Minimum number of threads in Spark thread-pool (shared globally)
+| maxThreads | int | Maximum number of threads in Spark thread-pool (shared globally)
+| timeOutMillis | int | Thread idle timeout in millis; threads that have been idle for a longer period will be removed from the thread pool.
+| keystoreFile | String | Configures the keystore file to use for a secure connection
+| keystorePassword | String | Configures the keystore password to use for a secure connection
+| truststoreFile | String | Configures the truststore file to use for a secure connection
+| truststorePassword | String | Configures the truststore password to use for a secure connection
+| sparkConfiguration | SparkConfiguration | To use the shared SparkConfiguration
+| sparkBinding | SparkBinding | To use a custom SparkBinding to map to/from Camel message.
+|=======================================================================
+{% endraw %}
+// component options: END
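+
+For example, the component-level options above can be configured on the
+component instance before the routes are started. The following is a minimal
+sketch; it assumes the component class is
+org.apache.camel.component.sparkrest.SparkComponent and that it exposes setters
+matching the option names (eg setPort, setIpAddress):
+
+[source,java]
+---------------------------------------------------------------
+  // sketch: configure the shared Spark Rest component (assumed setters);
+  // camelContext is an existing org.apache.camel.CamelContext
+  SparkComponent spark = camelContext.getComponent("spark-rest", SparkComponent.class);
+  spark.setPort(8080);           // default is 4567
+  spark.setIpAddress("0.0.0.0"); // address Spark listens on
+---------------------------------------------------------------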
+
+
+
+// endpoint options: START
+The Spark Rest component supports 13 endpoint options which are listed below:
+
+{% raw %}
+[width="100%",cols="2s,1,1m,1m,5",options="header"]
+|=======================================================================
+| Name | Group | Default | Java Type | Description
+| verb | consumer |  | String | *Required* get, post, put, patch, delete, head, trace, connect, or options.
+| path | consumer |  | String | *Required* The content path, which supports the Spark syntax.
+| accept | consumer |  | String | Accept type such as: 'text/xml' or 'application/json'. By default we accept all kinds of types.
+| bridgeErrorHandler | consumer | false | boolean | Allows for bridging the consumer to the Camel routing Error Handler, which means any exceptions that occur while the consumer is trying to pick up incoming messages, or the like, will now be processed as a message and handled by the routing Error Handler. By default the consumer will use the org.apache.camel.spi.ExceptionHandler to deal with exceptions; they will be logged at WARN/ERROR level and ignored.
+| disableStreamCache | consumer | false | boolean | Determines whether or not the raw input stream from Spark HttpRequest#getContent() is cached (Camel will read the stream into an in-memory light-weight stream cache). By default Camel will cache the Netty input stream to support reading it multiple times, to ensure Camel can retrieve all data from the stream. However you can set this option to true when you for example need to access the raw stream, such as streaming it directly to a file or other persistent store. Mind that if you enable this option then you cannot read the Netty stream multiple times out of the box, and you would need to manually reset the reader index on the Spark raw stream.
+| mapHeaders | consumer | true | boolean | If this option is enabled then during binding from Spark to Camel Message the headers will be mapped as well (eg added as headers to the Camel Message). You can turn this option off to disable this. The headers can still be accessed from the org.apache.camel.component.sparkrest.SparkMessage message with the method getRequest() that returns the Spark HTTP request instance.
+| transferException | consumer | false | boolean | If enabled, and an Exchange failed processing on the consumer side, then the caused Exception will be sent back serialized in the response as an application/x-java-serialized-object content type. This is by default turned off. If you enable this then be aware that Java will deserialize the incoming data from the request to Java, and that can be a potential security risk.
+| urlDecodeHeaders | consumer | false | boolean | If this option is enabled then during binding from Spark to Camel Message the header values will be URL decoded (eg %20 will be a space character).
+| exceptionHandler | consumer (advanced) |  | ExceptionHandler | To let the consumer use a custom ExceptionHandler. Notice that if the option bridgeErrorHandler is enabled then this option is not in use. By default the consumer will deal with exceptions; they will be logged at WARN/ERROR level and ignored.
+| exchangePattern | advanced | InOnly | ExchangePattern | Sets the default exchange pattern when creating an exchange.
+| matchOnUriPrefix | advanced | false | boolean | Whether or not the consumer should try to find a target consumer by matching the URI prefix if no exact match is found.
+| sparkBinding | advanced |  | SparkBinding | To use a custom SparkBinding to map to/from the Camel message.
+| synchronous | advanced | false | boolean | Sets whether synchronous processing should be strictly used, or Camel is allowed to use asynchronous processing (if supported).
+|=======================================================================
+{% endraw %}
+// endpoint options: END
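+
+Endpoint options are set as query parameters on the endpoint URI. For example,
+the route below (a sketch using options from the table above) disables header
+mapping and matches on the URI prefix:
+
+[source,java]
+-----------------------------------------------------------------
+  from("spark-rest:get:hello?mapHeaders=false&matchOnUriPrefix=true")
+    .transform().constant("Bye World");
+-----------------------------------------------------------------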
+
+
+[[Spark-rest-PathusingSparksyntax]]
+Path using Spark syntax
+^^^^^^^^^^^^^^^^^^^^^^^
+
+The path option is defined using the Spark REST syntax, where you define
+the REST context path with support for parameters and splat. See more
+details in the http://sparkjava.com/readme.html#title1[Spark Java Route]
+documentation.
+
+The following is a Camel route using a fixed path:
+
+[source,java]
+---------------------------------------
+  from("spark-rest:get:hello")
+    .transform().constant("Bye World");
+---------------------------------------
+
+And the following route uses a parameter which is mapped to a Camel
+header with the key "me".
+
+[source,java]
+--------------------------------------------
+  from("spark-rest:get:hello/:me")
+    .transform().simple("Bye ${header.me}");
+--------------------------------------------
+
+[[Spark-rest-MappingtoCamelMessage]]
+Mapping to Camel Message
+^^^^^^^^^^^^^^^^^^^^^^^^
+
+The Spark Request object is mapped to a Camel Message as
+a `org.apache.camel.component.sparkrest.SparkMessage`, which has access
+to the raw Spark request using the getRequest method. By default the
+Spark body is mapped to the Camel message body, and any HTTP headers / Spark
+parameters are mapped to Camel Message headers. There is special support
+for the Spark splat syntax, which is mapped to the Camel message header
+with the key splat.
+
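+If you need the raw Spark request you can access it from the SparkMessage, for
+example in a Processor. The following is a minimal sketch; it assumes the Spark
+request type is spark.Request and that the incoming message can be cast to
+SparkMessage:
+
+[source,java]
+------------------------------------------------------------------------------
+  from("spark-rest:get:hello/:me")
+    .process(exchange -> {
+        // sketch: access the underlying Spark HTTP request via getRequest()
+        SparkMessage msg = (SparkMessage) exchange.getIn();
+        spark.Request request = msg.getRequest();
+        exchange.getIn().setBody("Bye " + request.params(":me"));
+    });
+------------------------------------------------------------------------------
+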
+For example, the route below uses the Spark splat (the asterisk
+sign) in the context path, which we can access as a header from the
+Simple language to construct a response message.
+
+[source,java]
+------------------------------------------------------------------------------
+  from("spark-rest:get:/hello/*/to/*")
+    .transform().simple("Bye big ${header.splat[1]} from ${header.splat[0]}");
+------------------------------------------------------------------------------
+
+[[Spark-rest-RestDSL]]
+Rest DSL
+^^^^^^^^
+
+Apache Camel provides a new Rest DSL that allows you to define REST
+services in a nice REST style. For example we can define a REST hello
+service as shown below:
+
+[source,java]
+----------------------------------------------------------------
+  return new RouteBuilder() {
+      @Override
+      public void configure() throws Exception {
+          rest("/hello/{me}").get()
+              .route().transform().simple("Bye ${header.me}");
+      }
+  };
+----------------------------------------------------------------
+
+[source,xml]
+--------------------------------------------------------------
+  <camelContext xmlns="http://camel.apache.org/schema/spring">
+    <rest uri="/hello/{me}">
+      <get>
+        <route>
+          <transform>
+            <simple>Bye ${header.me}</simple>
+          </transform>
+        </route>
+      </get>
+    </rest>
+  </camelContext>
+--------------------------------------------------------------
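+
+To let the Rest DSL use this component you would typically configure it in the
+route builder as well. The following is a minimal sketch (assuming the standard
+restConfiguration DSL and the component name "spark-rest"):
+
+[source,java]
+----------------------------------------------------------------
+  // sketch: tell the Rest DSL to use the spark-rest component on the default port
+  restConfiguration().component("spark-rest").port(4567);
+----------------------------------------------------------------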
+
+See more details at the link:rest-dsl.html[Rest DSL].
+
+[[Spark-rest-Moreexamples]]
+More examples
+^^^^^^^^^^^^^
+
+There is a *camel-example-spark-rest-tomcat* example in the Apache Camel
+distribution that demonstrates how to use camel-spark-rest in a web
+application that can be deployed on Apache Tomcat or similar web
+containers.
+
+[[Spark-rest-SeeAlso]]
+See Also
+^^^^^^^^
+
+* link:configuring-camel.html[Configuring Camel]
+* link:component.html[Component]
+* link:endpoint.html[Endpoint]
+* link:getting-started.html[Getting Started]
+* link:rest.html[Rest]
+

http://git-wip-us.apache.org/repos/asf/camel/blob/df5a8e02/docs/user-manual/en/SUMMARY.md
----------------------------------------------------------------------
diff --git a/docs/user-manual/en/SUMMARY.md b/docs/user-manual/en/SUMMARY.md
index 7c52fbd..dd955d4 100644
--- a/docs/user-manual/en/SUMMARY.md
+++ b/docs/user-manual/en/SUMMARY.md
@@ -243,6 +243,7 @@
     * [SNMP](snmp.adoc)
     * [Solr](solr.adoc)
     * [Spark](spark.adoc)
+    * [Spark-Rest](spark-rest.adoc)
     * [Telegram](telegram.adoc)
     * [Twitter](twitter.adoc)
     * [Websocket](websocket.adoc)

