zeppelin-commits mailing list archives

From k..@apache.org
Subject [2/2] zeppelin git commit: [ZEPPELIN-2403] interpreter property widgets
Date Thu, 06 Jul 2017 06:55:01 GMT
[ZEPPELIN-2403] interpreter property widgets

### What is this PR for?
This supersedes the previous PR #2251, which I accidentally broke.

Added widgets (string, number, textarea, url, password, checkbox) to interpreter properties. These widgets let each property customize how its value is displayed and edited (for example, masking a password).
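For example, after this change a property entry in `interpreter-setting.json` carries a `type` field naming its widget (this snippet is taken from the Alluxio settings in the diff below):

```json
{
  "alluxio.master.port": {
    "envName": "ALLUXIO_MASTER_PORT",
    "propertyName": "alluxio.master.port",
    "defaultValue": "19998",
    "description": "Alluxio master port",
    "type": "number"
  }
}
```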

### What type of PR is it?
Feature

### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-2403

### How should this be tested?
- Remove `conf/interpreter.json`
- Try the new interpreter-settings form (create, edit)
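A quick way to sanity-check an edited `interpreter-setting.json` is to verify that every property's `type` is one of the widgets this PR introduces. The widget list below is taken from the diff; the validator itself is only an illustrative sketch, not part of the PR:

```python
import json

# Widget types introduced by this PR (textarea, string, number, url,
# password, checkbox), as they appear in the interpreter-setting.json diffs.
ALLOWED_WIDGETS = {"textarea", "string", "number", "url", "password", "checkbox"}

def invalid_properties(properties):
    """Return (name, type) pairs whose widget type is missing or unknown."""
    return [(name, prop.get("type"))
            for name, prop in properties.items()
            if prop.get("type") not in ALLOWED_WIDGETS]

# Hypothetical properties block with one deliberate typo in its type.
example = json.loads("""
{
  "alluxio.master.hostname": {
    "defaultValue": "localhost",
    "description": "Alluxio master hostname",
    "type": "string"
  },
  "alluxio.master.port": {
    "defaultValue": "19998",
    "description": "Alluxio master port",
    "type": "numbr"
  }
}
""")

print(invalid_properties(example))  # -> [('alluxio.master.port', 'numbr')]
```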

### Screenshots (if appropriate)
edit
![edit](https://cloud.githubusercontent.com/assets/25951039/25130228/e2a28060-245a-11e7-895a-d7c1571f885f.png)

view
![view](https://cloud.githubusercontent.com/assets/25951039/25130227/e2a10906-245a-11e7-9ea3-0bd070219f42.png)

### Questions:
* Do the license files need to be updated? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Tinkoff DWH <tinkoff.dwh@gmail.com>
Author: isys.mreshetov <m.reshetov@i-sys.ru>

Closes #2268 from tinkoff-dwh/ZEPPELIN-2403 and squashes the following commits:

75a10464 [isys.mreshetov] ZEPPELIN-2403 imports fix
7be8ddff [isys.mreshetov] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
585fc364 [isys.mreshetov] ZEPPELIN-2403 documentation fix
4b633993 [isys.mreshetov] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
726c1f31 [isys.mreshetov] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
b17dfb59 [isys.mreshetov] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
098fbd14 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
a5f13272 [Tinkoff DWH] [ZEPPELIN-2403] checkstyle fix
fd25c467 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
e35ff58f [Tinkoff DWH] [ZEPPELIN-2403] fix checkstyle
7c25b6db [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
10ce996a [Tinkoff DWH] [ZEPPELIN-2403] merge widget and type
ca1e2bf7 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
99daca6d [Tinkoff DWH] [ZEPPELIN-2403] fix rest api test
f735c0a9 [Tinkoff DWH] [ZEPPELIN-2403] fix test
c6d24c4c [Tinkoff DWH] [ZEPPELIN-2403] converter for old settings to new (with widgets)
76a98083 [Tinkoff DWH] Merge remote-tracking branch 'origin/master' into ZEPPELIN-2403
b41e7a3f [Tinkoff DWH] ZEPPELIN-2403 checkstyle
637cb0a1 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
e92713c7 [Tinkoff DWH] [ZEPPELIN-2403] generalized types, added new types
07160e00 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
a495137f [Tinkoff DWH] ZEPPELIN-2403 eslint fix
fd8d2781 [Tinkoff DWH] Merge remote-tracking branch 'origin/master' into ZEPPELIN-2403_backup
4f271d9b [Tinkoff DWH] ZEPPELIN-2403  rename to widget  added new widgets  string,  number,  url
dd5d6c80 [Tinkoff DWH] ZEPPELIN-2403 did properties immutable, added new type 'checkbox'
14353b12 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
12499ae1 [Tinkoff DWH] Merge remote-tracking branch 'upstream/master' into ZEPPELIN-2403
45f5f627 [Tinkoff DWH] ZEPPELIN-2403 added interpreter property types


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/155a55b5
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/155a55b5
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/155a55b5

Branch: refs/heads/master
Commit: 155a55b5609b9cdab9369b35b97218623dc1de8e
Parents: 2842251
Author: Tinkoff DWH <tinkoff.dwh@gmail.com>
Authored: Wed Jul 5 09:08:30 2017 +0500
Committer: 1ambda <1amb4a@gmail.com>
Committed: Thu Jul 6 15:54:55 2017 +0900

----------------------------------------------------------------------
 .../src/main/resources/interpreter-setting.json |   6 +-
 .../src/main/resources/interpreter-setting.json |   6 +-
 .../src/main/resources/interpreter-setting.json |   9 +-
 .../src/main/resources/interpreter-setting.json |  95 +++++++++-----
 .../development/writing_zeppelin_interpreter.md |   6 +-
 docs/usage/rest_api/interpreter.md              |  83 ++++++++++--
 .../src/main/resources/interpreter-setting.json |  21 ++-
 .../src/main/resources/interpreter-setting.json |   9 +-
 .../src/main/resources/interpreter-setting.json |   6 +-
 .../src/main/resources/interpreter-setting.json |   9 +-
 .../src/main/resources/interpreter-setting.json |   3 +-
 .../src/main/resources/interpreter-setting.json |  11 +-
 .../src/main/resources/interpreter-setting.json |   3 +-
 .../src/main/resources/interpreter-setting.json |  19 ++-
 .../src/main/resources/interpreter-setting.json |  38 ++++--
 .../src/main/resources/interpreter-setting.json |  23 ++--
 .../src/main/resources/interpreter-setting.json |  25 ++--
 .../src/main/resources/interpreter-setting.json |  68 ++++++----
 .../src/main/resources/interpreter-setting.json |   3 +-
 pig/src/main/resources/interpreter-setting.json |  17 ++-
 .../src/main/resources/interpreter-setting.json |   6 +-
 r/src/main/resources/interpreter-setting.json   |  24 ++--
 .../src/main/resources/interpreter-setting.json |   6 +-
 .../src/main/resources/interpreter-setting.json |  12 +-
 .../apache/zeppelin/spark/SparkInterpreter.java |  50 ++++++--
 .../src/main/resources/interpreter-setting.json |  63 +++++----
 .../sparkr-resources/interpreter-setting.json   |  77 ++++++-----
 .../interpreter/DefaultInterpreterProperty.java | 128 +++++++++++++++++++
 .../zeppelin/interpreter/Interpreter.java       |  19 +--
 .../interpreter/InterpreterProperty.java        |  87 ++++---------
 .../interpreter/InterpreterPropertyBuilder.java |   9 +-
 .../interpreter/InterpreterPropertyType.java    |  62 +++++++++
 .../zeppelin/rest/InterpreterRestApi.java       |  32 +++--
 .../message/NewInterpreterSettingRequest.java   |   5 +-
 .../UpdateInterpreterSettingRequest.java        |   9 +-
 .../zeppelin/integration/InterpreterIT.java     |   2 +-
 .../zeppelin/rest/AbstractTestRestApi.java      |  34 +++--
 .../zeppelin/rest/InterpreterRestApiTest.java   |  15 ++-
 zeppelin-web/package.json                       |   2 +-
 .../interpreter-create/interpreter-create.html  |  41 ++++--
 .../app/interpreter/interpreter.controller.js   |  49 +++++--
 .../src/app/interpreter/interpreter.html        |  41 +++++-
 .../widget/widget.number.directive.js           |  31 +++++
 zeppelin-web/src/index.js                       |   1 +
 .../interpreter/InterpreterFactory.java         |   2 +-
 .../interpreter/InterpreterSetting.java         |  50 +++++++-
 .../interpreter/InterpreterSettingManager.java  |  84 +++++++-----
 .../helium/HeliumApplicationFactoryTest.java    |   9 +-
 .../interpreter/InterpreterFactoryTest.java     | 107 ++++++++++------
 .../notebook/NoteInterpreterLoaderTest.java     |  12 +-
 .../apache/zeppelin/notebook/NotebookTest.java  |   8 +-
 .../notebook/repo/VFSNotebookRepoTest.java      |   5 +-
 52 files changed, 1088 insertions(+), 454 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/alluxio/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/alluxio/src/main/resources/interpreter-setting.json b/alluxio/src/main/resources/interpreter-setting.json
index 8b082ab..b9ab898 100644
--- a/alluxio/src/main/resources/interpreter-setting.json
+++ b/alluxio/src/main/resources/interpreter-setting.json
@@ -8,13 +8,15 @@
         "envName": "ALLUXIO_MASTER_HOSTNAME",
         "propertyName": "alluxio.master.hostname",
         "defaultValue": "localhost",
-        "description": "Alluxio master hostname"
+        "description": "Alluxio master hostname",
+        "type": "string"
       },
       "alluxio.master.port": {
         "envName": "ALLUXIO_MASTER_PORT",
         "propertyName": "alluxio.master.port",
         "defaultValue": "19998",
-        "description": "Alluxio master port"
+        "description": "Alluxio master port",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/beam/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/beam/src/main/resources/interpreter-setting.json b/beam/src/main/resources/interpreter-setting.json
index 428b76d..e9b4a73 100644
--- a/beam/src/main/resources/interpreter-setting.json
+++ b/beam/src/main/resources/interpreter-setting.json
@@ -19,13 +19,15 @@
         "envName": "ZEPPELIN_SCIO_ARGZ",
         "propertyName": "zeppelin.scio.argz",
         "defaultValue": "--runner=InProcessPipelineRunner",
-        "description": "Scio interpreter wide arguments"
+        "description": "Scio interpreter wide arguments",
+        "type": "textarea"
       },
       "zeppelin.scio.maxResult": {
         "envName": "ZEPPELIN_SCIO_MAXRESULT",
         "propertyName": "zeppelin.scio.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of SCollection results to display."
+        "description": "Max number of SCollection results to display.",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/bigquery/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/bigquery/src/main/resources/interpreter-setting.json b/bigquery/src/main/resources/interpreter-setting.json
index f782495..3e1f27a 100644
--- a/bigquery/src/main/resources/interpreter-setting.json
+++ b/bigquery/src/main/resources/interpreter-setting.json
@@ -8,13 +8,15 @@
         "envName": null,
         "propertyName": "zeppelin.bigquery.project_id",
         "defaultValue": " ",
-        "description": "Google Project ID"
+        "description": "Google Project ID",
+        "type": "string"
       },
       "zeppelin.bigquery.wait_time": {
         "envName": null,
         "propertyName": "zeppelin.bigquery.wait_time",
         "defaultValue": "5000",
-        "description": "Query timeout in Milliseconds"
+        "description": "Query timeout in Milliseconds",
+        "type": "number"
       },
       "zeppelin.bigquery.max_no_of_rows": {
         "envName": null,
@@ -26,7 +28,8 @@
         "envName": null,
         "propertyName": "zeppelin.bigquery.use_legacy_sql",
         "defaultValue": "true",
-        "description": "set true to use legacy sql"
+        "description": "set true to use legacy sql",
+        "type": "checkbox"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/cassandra/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/cassandra/src/main/resources/interpreter-setting.json b/cassandra/src/main/resources/interpreter-setting.json
index 3df120d..0b63585 100644
--- a/cassandra/src/main/resources/interpreter-setting.json
+++ b/cassandra/src/main/resources/interpreter-setting.json
@@ -8,187 +8,218 @@
         "envName": null,
         "propertyName": "cassandra.hosts",
         "defaultValue": "localhost",
-        "description": "Comma separated Cassandra hosts (DNS name or IP address). Default = localhost. Ex: '192.168.0.12,node2,node3'"
+        "description": "Comma separated Cassandra hosts (DNS name or IP address). Default = localhost. Ex: '192.168.0.12,node2,node3'",
+        "type": "textarea"
       },
       "cassandra.native.port": {
         "envName": null,
         "propertyName": "cassandra.native.port",
         "defaultValue": "9042",
-        "description": "Cassandra native port. Default = 9042"
+        "description": "Cassandra native port. Default = 9042",
+        "type": "number"
       },
       "cassandra.protocol.version": {
         "envName": null,
         "propertyName": "cassandra.protocol.version",
         "defaultValue": "4",
-        "description": "Cassandra protocol version. Default = 4"
+        "description": "Cassandra protocol version. Default = 4",
+        "type": "string"
       },
       "cassandra.cluster": {
         "envName": null,
         "propertyName": "cassandra.cluster",
         "defaultValue": "Test Cluster",
-        "description": "Cassandra cluster name. Default = 'Test Cluster'"
+        "description": "Cassandra cluster name. Default = 'Test Cluster'",
+        "type": "string"
       },
       "cassandra.keyspace": {
         "envName": null,
         "propertyName": "cassandra.keyspace",
         "defaultValue": "system",
-        "description": "Cassandra keyspace name. Default = 'system'"
+        "description": "Cassandra keyspace name. Default = 'system'",
+        "type": "string"
       },
       "cassandra.compression.protocol": {
         "envName": null,
         "propertyName": "cassandra.compression.protocol",
         "defaultValue": "NONE",
-        "description": "Cassandra compression protocol. Available values: NONE, SNAPPY, LZ4. Default = NONE"
+        "description": "Cassandra compression protocol. Available values: NONE, SNAPPY, LZ4. Default = NONE",
+        "type": "string"
       },
       "cassandra.credentials.username": {
         "envName": null,
         "propertyName": "cassandra.credentials.username",
         "defaultValue": "none",
-        "description": "Cassandra credentials username. Default = 'none'"
+        "description": "Cassandra credentials username. Default = 'none'",
+        "type": "string"
       },
       "cassandra.credentials.password": {
         "envName": null,
         "propertyName": "cassandra.credentials.password",
         "defaultValue": "none",
-        "description": "Cassandra credentials password. Default = 'none'"
+        "description": "Cassandra credentials password. Default = 'none'",
+        "type": "password"
       },
       "cassandra.load.balancing.policy": {
         "envName": null,
         "propertyName": "cassandra.load.balancing.policy",
         "defaultValue": "DEFAULT",
-        "description": "Cassandra Load Balancing Policy. Default = new TokenAwarePolicy(new DCAwareRoundRobinPolicy())"
+        "description": "Cassandra Load Balancing Policy. Default = new TokenAwarePolicy(new DCAwareRoundRobinPolicy())",
+        "type": "string"
       },
       "cassandra.retry.policy": {
         "envName": null,
         "propertyName": "cassandra.retry.policy",
         "defaultValue": "DEFAULT",
-        "description": "Cassandra Retry Policy. Default = DefaultRetryPolicy.INSTANCE"
+        "description": "Cassandra Retry Policy. Default = DefaultRetryPolicy.INSTANCE",
+        "type": "string"
       },
       "cassandra.reconnection.policy": {
         "envName": null,
         "propertyName": "cassandra.reconnection.policy",
         "defaultValue": "DEFAULT",
-        "description": "Cassandra Reconnection Policy. Default = new ExponentialReconnectionPolicy(1000, 10 * 60 * 1000)"
+        "description": "Cassandra Reconnection Policy. Default = new ExponentialReconnectionPolicy(1000, 10 * 60 * 1000)",
+        "type": "string"
       },
       "cassandra.speculative.execution.policy": {
         "envName": null,
         "propertyName": "cassandra.speculative.execution.policy",
         "defaultValue": "DEFAULT",
-        "description": "Cassandra Speculative Execution Policy. Default = NoSpeculativeExecutionPolicy.INSTANCE"
+        "description": "Cassandra Speculative Execution Policy. Default = NoSpeculativeExecutionPolicy.INSTANCE",
+        "type": "string"
       },
       "cassandra.interpreter.parallelism": {
         "envName": null,
         "propertyName": "cassandra.interpreter.parallelism",
         "defaultValue": "10",
-        "description": "Cassandra interpreter parallelism.Default = 10"
+        "description": "Cassandra interpreter parallelism.Default = 10",
+        "type": "number"
       },
       "cassandra.max.schema.agreement.wait.second": {
         "envName": null,
         "propertyName": "cassandra.max.schema.agreement.wait.second",
         "defaultValue": "10",
-        "description": "Cassandra max schema agreement wait in second.Default = ProtocolOptions.DEFAULT_MAX_SCHEMA_AGREEMENT_WAIT_SECONDS"
+        "description": "Cassandra max schema agreement wait in second.Default = ProtocolOptions.DEFAULT_MAX_SCHEMA_AGREEMENT_WAIT_SECONDS",
+        "type": "number"
       },
       "cassandra.pooling.new.connection.threshold.local": {
         "envName": null,
         "propertyName": "cassandra.pooling.new.connection.threshold.local",
         "defaultValue": "100",
-        "description": "Cassandra new connection threshold local. Protocol V2 and below default = 100 Protocol V3 and above default = 800"
+        "description": "Cassandra new connection threshold local. Protocol V2 and below default = 100 Protocol V3 and above default = 800",
+        "type": "number"
       },
       "cassandra.pooling.new.connection.threshold.remote": {
         "envName": null,
         "propertyName": "cassandra.pooling.new.connection.threshold.remote",
         "defaultValue": "100",
-        "description": "Cassandra new connection threshold remove. Protocol V2 and below default = 100 Protocol V3 and above default = 200"
+        "description": "Cassandra new connection threshold remove. Protocol V2 and below default = 100 Protocol V3 and above default = 200",
+        "type": "number"
       },
       "cassandra.pooling.core.connection.per.host.local": {
         "envName": null,
         "propertyName": "cassandra.pooling.core.connection.per.host.local",
         "defaultValue": "2",
-        "description": "Cassandra core connection per host local. Protocol V2 and below default = 2 Protocol V3 and above default = 1"
+        "description": "Cassandra core connection per host local. Protocol V2 and below default = 2 Protocol V3 and above default = 1",
+        "type": "number"
       },
       "cassandra.pooling.core.connection.per.host.remote": {
         "envName": null,
         "propertyName": "cassandra.pooling.core.connection.per.host.remote",
         "defaultValue": "1",
-        "description": "Cassandra core connection per host remove. Protocol V2 and below default = 1 Protocol V3 and above default = 1"
+        "description": "Cassandra core connection per host remove. Protocol V2 and below default = 1 Protocol V3 and above default = 1",
+        "type": "number"
       },
       "cassandra.pooling.max.connection.per.host.local": {
         "envName": null,
         "propertyName": "cassandra.pooling.max.connection.per.host.local",
         "defaultValue": "8",
-        "description": "Cassandra max connection per host local. Protocol V2 and below default = 8 Protocol V3 and above default = 1"
+        "description": "Cassandra max connection per host local. Protocol V2 and below default = 8 Protocol V3 and above default = 1",
+        "type": "number"
       },
       "cassandra.pooling.max.connection.per.host.remote": {
         "envName": null,
         "propertyName": "cassandra.pooling.max.connection.per.host.remote",
         "defaultValue": "2",
-        "description": "Cassandra max connection per host remote. Protocol V2 and below default = 2 Protocol V3 and above default = 1"
+        "description": "Cassandra max connection per host remote. Protocol V2 and below default = 2 Protocol V3 and above default = 1",
+        "type": "number"
       },
       "cassandra.pooling.max.request.per.connection.local": {
         "envName": null,
         "propertyName": "cassandra.pooling.max.request.per.connection.local",
         "defaultValue": "1024",
-        "description": "Cassandra max request per connection local. Protocol V2 and below default = 128 Protocol V3 and above default = 1024"
+        "description": "Cassandra max request per connection local. Protocol V2 and below default = 128 Protocol V3 and above default = 1024",
+        "type": "number"
       },
       "cassandra.pooling.max.request.per.connection.remote": {
         "envName": null,
         "propertyName": "cassandra.pooling.max.request.per.connection.remote",
         "defaultValue": "256",
-        "description": "Cassandra max request per connection remote. Protocol V2 and below default = 128 Protocol V3 and above default = 256"
+        "description": "Cassandra max request per connection remote. Protocol V2 and below default = 128 Protocol V3 and above default = 256",
+        "type": "number"
       },
       "cassandra.pooling.idle.timeout.seconds": {
         "envName": null,
         "propertyName": "cassandra.pooling.idle.timeout.seconds",
         "defaultValue": "120",
-        "description": "Cassandra idle time out in seconds. Default = 120"
+        "description": "Cassandra idle time out in seconds. Default = 120",
+        "type": "number"
       },
       "cassandra.pooling.pool.timeout.millisecs": {
         "envName": null,
         "propertyName": "cassandra.pooling.pool.timeout.millisecs",
         "defaultValue": "5000",
-        "description": "Cassandra pool time out in millisecs. Default = 5000"
+        "description": "Cassandra pool time out in millisecs. Default = 5000",
+        "type": "number"
       },
       "cassandra.pooling.heartbeat.interval.seconds": {
         "envName": null,
         "propertyName": "cassandra.pooling.heartbeat.interval.seconds",
         "defaultValue": "30",
-        "description": "Cassandra pool heartbeat interval in secs. Default = 30"
+        "description": "Cassandra pool heartbeat interval in secs. Default = 30",
+        "type": "number"
       },
       "cassandra.query.default.consistency": {
         "envName": null,
         "propertyName": "cassandra.query.default.consistency",
         "defaultValue": "ONE",
-        "description": "Cassandra query default consistency level. Default = ONE"
+        "description": "Cassandra query default consistency level. Default = ONE",
+        "type": "string"
       },
       "cassandra.query.default.serial.consistency": {
         "envName": null,
         "propertyName": "cassandra.query.default.serial.consistency",
         "defaultValue": "SERIAL",
-        "description": "Cassandra query default serial consistency level. Default = SERIAL"
+        "description": "Cassandra query default serial consistency level. Default = SERIAL",
+        "type": "string"
       },
       "cassandra.query.default.fetchSize": {
         "envName": null,
         "propertyName": "cassandra.query.default.fetchSize",
         "defaultValue": "5000",
-        "description": "Cassandra query default fetch size. Default = 5000"
+        "description": "Cassandra query default fetch size. Default = 5000",
+        "type": "number"
       },
       "cassandra.socket.connection.timeout.millisecs": {
         "envName": null,
         "propertyName": "cassandra.socket.connection.timeout.millisecs",
         "defaultValue": "5000",
-        "description": "Cassandra socket default connection timeout in millisecs. Default = 5000"
+        "description": "Cassandra socket default connection timeout in millisecs. Default = 5000",
+        "type": "number"
       },
       "cassandra.socket.read.timeout.millisecs": {
         "envName": null,
         "propertyName": "cassandra.socket.read.timeout.millisecs",
         "defaultValue": "12000",
-        "description": "Cassandra socket read timeout in millisecs. Default = 12000"
+        "description": "Cassandra socket read timeout in millisecs. Default = 12000",
+        "type": "number"
       },
       "cassandra.socket.tcp.no_delay": {
         "envName": null,
         "propertyName": "cassandra.socket.tcp.no_delay",
-        "defaultValue": "true",
-        "description": "Cassandra socket TCP no delay. Default = true"
+        "defaultValue": true,
+        "description": "Cassandra socket TCP no delay. Default = true",
+        "type": "checkbox"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/docs/development/writing_zeppelin_interpreter.md
----------------------------------------------------------------------
diff --git a/docs/development/writing_zeppelin_interpreter.md b/docs/development/writing_zeppelin_interpreter.md
index 17081a6..6ba24bc 100644
--- a/docs/development/writing_zeppelin_interpreter.md
+++ b/docs/development/writing_zeppelin_interpreter.md
@@ -61,13 +61,15 @@ Here is an example of `interpreter-setting.json` on your own interpreter.
         "envName": null,
         "propertyName": "property.1.name",
         "defaultValue": "propertyDefaultValue",
-        "description": "Property description"
+        "description": "Property description",
+        "type": "textarea"
       },
       "properties2": {
         "envName": PROPERTIES_2,
         "propertyName": null,
         "defaultValue": "property2DefaultValue",
-        "description": "Property 2 description"
+        "description": "Property 2 description",
+        "type": "textarea"
       }, ...
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/docs/usage/rest_api/interpreter.md
----------------------------------------------------------------------
diff --git a/docs/usage/rest_api/interpreter.md b/docs/usage/rest_api/interpreter.md
index 02c7ed8..4869866 100644
--- a/docs/usage/rest_api/interpreter.md
+++ b/docs/usage/rest_api/interpreter.md
@@ -75,12 +75,15 @@ The role of registered interpreters, settings and interpreters group are describ
       "className": "org.apache.zeppelin.spark.SparkInterpreter",
       "properties": {
         "spark.executor.memory": {
+          "name": "spark.executor.memory",
           "defaultValue": "1g",
-          "description": "Executor memory per worker instance. ex) 512m, 32g"
+          "description": "Executor memory per worker instance. ex) 512m, 32g",
+          "type": "string"
         },
         "spark.cores.max": {
           "defaultValue": "",
-          "description": "Total number of cores to use. Empty value uses all available core."
+          "description": "Total number of cores to use. Empty value uses all available core.",
+          "type": "number"
         },
       },
       "path": "/zeppelin/interpreter/spark"
@@ -91,8 +94,10 @@ The role of registered interpreters, settings and interpreters group are describ
       "className": "org.apache.zeppelin.spark.SparkSqlInterpreter",
       "properties": {
         "zeppelin.spark.maxResult": {
+          "name": "zeppelin.spark.maxResult",
           "defaultValue": "1000",
-          "description": "Max number of Spark SQL result to display."
+          "description": "Max number of Spark SQL result to display.",
+          "type": "number"
         }
       },
       "path": "/zeppelin/interpreter/spark"
@@ -153,8 +158,16 @@ The role of registered interpreters, settings and interpreters group are describ
       "name": "spark",
       "group": "spark",
       "properties": {
-        "spark.cores.max": "",
-        "spark.executor.memory": "1g",
+        "spark.cores.max": {
+          "name": "",
+          "value": "spark.cores.max",
+          "type": "number"
+        },
+        "spark.executor.memory": {
+          "name": "spark.executor.memory",
+          "value": "1g",
+          "type": "string"
+        }
       },
       "interpreterGroup": [
         {
@@ -215,7 +228,11 @@ The role of registered interpreters, settings and interpreters group are describ
     "name": "Markdown setting name",
     "group": "md",
     "properties": {
-      "propname": "propvalue"
+      "propname": {
+        "name": "propname",
+        "value": "propvalue",
+        "type": "textarea"
+      }
     },
     "interpreterGroup": [
       {
@@ -270,7 +287,10 @@ The role of registered interpreters, settings and interpreters group are describ
   "name": "Markdown setting name",
   "group": "md",
   "properties": {
-    "propname": "propvalue"
+    "propname": {
+      "name": "propname",
+      "value": "propvalue",
+      "type": "textarea"
   },
   "interpreterGroup": [
     {
@@ -302,7 +322,10 @@ The role of registered interpreters, settings and interpreters group are describ
     "name": "Markdown setting name",
     "group": "md",
     "properties": {
-      "propname": "propvalue"
+      "propname": {
+        "name": "propname",
+        "value": "propvalue",
+        "type": "textarea"
     },
     "interpreterGroup": [
       {
@@ -353,7 +376,10 @@ The role of registered interpreters, settings and interpreters group are describ
   "name": "Markdown setting name",
   "group": "md",
   "properties": {
-    "propname": "Otherpropvalue"
+    "propname": {
+      "name": "propname",
+      "value": "Otherpropvalue",
+      "type": "textarea"
   },
   "interpreterGroup": [
     {
@@ -385,7 +411,10 @@ The role of registered interpreters, settings and interpreters group are describ
     "name": "Markdown setting name",
     "group": "md",
     "properties": {
-      "propname": "Otherpropvalue"
+      "propname": {
+        "name": "propname",
+        "value": "Otherpropvalue",
+        "type": "textarea"
     },
     "interpreterGroup": [
       {
@@ -541,3 +570,37 @@ The role of registered interpreters, settings and interpreters group are describ
     </tr>
   </table>
   
+<br/>
+### Get available types for property
+  <table class="table-configuration">
+    <col width="200">
+    <tr>
+      <td>Description</td>
+      <td>This ```GET``` method returns available types for interpreter property.</td>
+    </tr>
+    <tr>
+      <td>URL</td>
+      <td>```http://[zeppelin-server]:[zeppelin-port]/api/interpreter/property/types```</td>
+    </tr>
+    <tr>
+      <td>Success code</td>
+      <td>200</td>
+    </tr>
+    <tr>
+      <td>Fail code</td>
+      <td> 500 </td>
+    </tr>
+    <tr>
+      <td>Sample JSON response</td>
+        <td>
+          <pre>
+{
+  "status": "OK",
+  "body": [ "textarea", "string", ...
+  ]
+}            
+          </pre>
+        </td>
+    </td>        
+  </table>  
+  

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/elasticsearch/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/elasticsearch/src/main/resources/interpreter-setting.json b/elasticsearch/src/main/resources/interpreter-setting.json
index 18200ae..6fac719 100644
--- a/elasticsearch/src/main/resources/interpreter-setting.json
+++ b/elasticsearch/src/main/resources/interpreter-setting.json
@@ -8,43 +8,50 @@
         "envName": "ELASTICSEARCH_HOST",
         "propertyName": "elasticsearch.host",
         "defaultValue": "localhost",
-        "description": "The host for Elasticsearch"
+        "description": "The host for Elasticsearch",
+        "type": "string"
       },
       "elasticsearch.port": {
         "envName": "ELASTICSEARCH_PORT",
         "propertyName": "elasticsearch.port",
         "defaultValue": "9300",
-        "description": "The port for Elasticsearch"
+        "description": "The port for Elasticsearch",
+        "type": "number"
       },
       "elasticsearch.client.type": {
         "envName": "ELASTICSEARCH_CLIENT_TYPE",
         "propertyName": "elasticsearch.client.type",
         "defaultValue": "transport",
-        "description": "The type of client for Elasticsearch (transport or http)"
+        "description": "The type of client for Elasticsearch (transport or http)",
+        "type": "string"
       },
       "elasticsearch.cluster.name": {
         "envName": "ELASTICSEARCH_CLUSTER_NAME",
         "propertyName": "elasticsearch.cluster.name",
         "defaultValue": "elasticsearch",
-        "description": "The cluster name for Elasticsearch"
+        "description": "The cluster name for Elasticsearch",
+        "type": "string"
       },
       "elasticsearch.result.size": {
         "envName": "ELASTICSEARCH_RESULT_SIZE",
         "propertyName": "elasticsearch.result.size",
         "defaultValue": "10",
-        "description": "The size of the result set of a search query"
+        "description": "The size of the result set of a search query",
+        "type": "number"
       },
       "elasticsearch.basicauth.username": {
         "envName": "ELASTICSEARCH_BASIC_AUTH_USERNAME",
         "propertyName": "elasticsearch.basicauth.username",
         "defaultValue": "",
-        "description": "Username for a basic authentication"
+        "description": "Username for a basic authentication",
+        "type": "string"
       },
       "elasticsearch.basicauth.password": {
         "envName": "ELASTICSEARCH_BASIC_AUTH_PASSWORD",
         "propertyName": "elasticsearch.basicauth.password",
         "defaultValue": "",
-        "description": "Password for a basic authentication"
+        "description": "Password for a basic authentication",
+        "type": "password"
       }
     },
     "editor": {

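The hunks above and below all follow one pattern: each property entry in an interpreter's `interpreter-setting.json` gains a `"type"` key naming the widget the settings UI should render. A minimal sketch of one such entry — the property name and values here are illustrative, not taken from any actual interpreter:

```json
{
  "properties": {
    "example.password": {
      "envName": "EXAMPLE_PASSWORD",
      "propertyName": "example.password",
      "defaultValue": "",
      "description": "Credential rendered as a masked input",
      "type": "password"
    }
  }
}
```

Note that `checkbox` properties also switch `defaultValue` from the string `"false"`/`"true"` to a JSON boolean, as the hunks for hbase, ignite, and livy show.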
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/file/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/file/src/main/resources/interpreter-setting.json b/file/src/main/resources/interpreter-setting.json
index b4f9199..ebe5cf6 100644
--- a/file/src/main/resources/interpreter-setting.json
+++ b/file/src/main/resources/interpreter-setting.json
@@ -8,19 +8,22 @@
         "envName": null,
         "propertyName": "hdfs.url",
         "defaultValue": "http://localhost:50070/webhdfs/v1/",
-        "description": "The URL for WebHDFS"
+        "description": "The URL for WebHDFS",
+        "type": "url"
       },
       "hdfs.user": {
         "envName": null,
         "propertyName": "hdfs.user",
         "defaultValue": "hdfs",
-        "description": "The WebHDFS user"
+        "description": "The WebHDFS user",
+        "type": "string"
       },
       "hdfs.maxlength": {
         "envName": null,
         "propertyName": "hdfs.maxlength",
         "defaultValue": "1000",
-        "description": "Maximum number of lines of results fetched"
+        "description": "Maximum number of lines of results fetched",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/flink/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/flink/src/main/resources/interpreter-setting.json b/flink/src/main/resources/interpreter-setting.json
index 0d4cf6b..f1a04bf 100644
--- a/flink/src/main/resources/interpreter-setting.json
+++ b/flink/src/main/resources/interpreter-setting.json
@@ -8,13 +8,15 @@
         "envName": "host",
         "propertyName": null,
         "defaultValue": "local",
-        "description": "host name of running JobManager. 'local' runs flink in local mode."
+        "description": "host name of running JobManager. 'local' runs flink in local mode.",
+        "type": "string"
       },
       "port": {
         "envName": "port",
         "propertyName": null,
         "defaultValue": "6123",
-        "description": "port of running JobManager."
+        "description": "port of running JobManager.",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/geode/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/geode/src/main/resources/interpreter-setting.json b/geode/src/main/resources/interpreter-setting.json
index f67cfef..2a0a81d 100644
--- a/geode/src/main/resources/interpreter-setting.json
+++ b/geode/src/main/resources/interpreter-setting.json
@@ -8,19 +8,22 @@
         "envName": null,
         "propertyName": "geode.locator.host",
         "defaultValue": "localhost",
-        "description": "The Geode Locator Host."
+        "description": "The Geode Locator Host.",
+        "type": "string"
       },
       "geode.locator.port": {
         "envName": null,
         "propertyName": "geode.locator.port",
         "defaultValue": "10334",
-        "description": "The Geode Locator Port."
+        "description": "The Geode Locator Port.",
+        "type": "number"
       },
       "geode.max.result": {
         "envName": null,
         "propertyName": "geode.max.result",
         "defaultValue": "1000",
-        "description": "Max number of OQL result to display."
+        "description": "Max number of OQL result to display.",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/groovy/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/groovy/src/main/resources/interpreter-setting.json b/groovy/src/main/resources/interpreter-setting.json
index 552f600..45aab84 100644
--- a/groovy/src/main/resources/interpreter-setting.json
+++ b/groovy/src/main/resources/interpreter-setting.json
@@ -8,7 +8,8 @@
         "envName": null,
         "propertyName": "GROOVY_CLASSES",
         "defaultValue": "",
-        "description": "The path for custom groovy classes location. If empty `./interpreter/groovy/classes`"
+        "description": "The path for custom groovy classes location. If empty `./interpreter/groovy/classes`",
+        "type": "textarea"
       }
     }
   }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/hbase/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/hbase/src/main/resources/interpreter-setting.json b/hbase/src/main/resources/interpreter-setting.json
index aa44295..28dedcc 100644
--- a/hbase/src/main/resources/interpreter-setting.json
+++ b/hbase/src/main/resources/interpreter-setting.json
@@ -8,17 +8,20 @@
         "envName": "HBASE_HOME",
         "propertyName": "hbase.home",
         "defaultValue": "/usr/lib/hbase/",
-        "description": "Installation directory of HBase"
+        "description": "Installation directory of HBase",
+        "type": "string"
       },
       "hbase.ruby.sources": {
         "propertyName": "hbase.ruby.sources",
         "defaultValue": "lib/ruby",
-        "description": "Path to Ruby scripts relative to 'hbase.home'"
+        "description": "Path to Ruby scripts relative to 'hbase.home'",
+        "type": "string"
       },
       "zeppelin.hbase.test.mode": {
         "propertyName": "zeppelin.hbase.test.mode",
-        "defaultValue": "false",
-        "description": "Disable checks for unit and manual tests"
+        "defaultValue": false,
+        "description": "Disable checks for unit and manual tests",
+        "type": "checkbox"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/helium-dev/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/helium-dev/src/main/resources/interpreter-setting.json b/helium-dev/src/main/resources/interpreter-setting.json
index b3a7010..5146178 100644
--- a/helium-dev/src/main/resources/interpreter-setting.json
+++ b/helium-dev/src/main/resources/interpreter-setting.json
@@ -8,7 +8,8 @@
         "envName": "PORT",
         "propertyName": "port",
         "defaultValue": "jdbc:postgresql://localhost:5432/",
-        "description": "The URL for JDBC."
+        "description": "The URL for JDBC.",
+        "type": "string"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/ignite/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/ignite/src/main/resources/interpreter-setting.json b/ignite/src/main/resources/interpreter-setting.json
index 92cf75b..2342a90 100644
--- a/ignite/src/main/resources/interpreter-setting.json
+++ b/ignite/src/main/resources/interpreter-setting.json
@@ -8,25 +8,29 @@
         "envName": null,
         "propertyName": "ignite.addresses",
         "defaultValue": "127.0.0.1:47500..47509",
-        "description": "Comma separated list of addresses (e.g. 127.0.0.1:47500 or 127.0.0.1:47500..47509)"
+        "description": "Comma separated list of addresses (e.g. 127.0.0.1:47500 or 127.0.0.1:47500..47509)",
+        "type": "textarea"
       },
       "ignite.clientMode": {
         "envName": null,
         "propertyName": "ignite.clientMode",
-        "defaultValue": "true",
-        "description": "Client mode. true or false"
+        "defaultValue": true,
+        "description": "Client mode. true or false",
+        "type": "checkbox"
       },
       "ignite.config.url": {
         "envName": null,
         "propertyName": "ignite.config.url",
         "defaultValue": "",
-        "description": "Configuration URL. Overrides all other settings."
+        "description": "Configuration URL. Overrides all other settings.",
+        "type": "url"
       },
       "ignite.peerClassLoadingEnabled": {
         "envName": null,
         "propertyName": "ignite.peerClassLoadingEnabled",
-        "defaultValue": "true",
-        "description": "Peer class loading enabled. True or false"
+        "defaultValue": true,
+        "description": "Peer class loading enabled. True or false",
+        "type": "checkbox"
       }
     }
   },
@@ -39,7 +43,8 @@
         "envName": null,
         "propertyName": "ignite.jdbc.url",
         "defaultValue": "jdbc:ignite:cfg://default-ignite-jdbc.xml",
-        "description": "Ignite JDBC connection URL."
+        "description": "Ignite JDBC connection URL.",
+        "type": "string"
         }
     }   
   }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/jdbc/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/jdbc/src/main/resources/interpreter-setting.json b/jdbc/src/main/resources/interpreter-setting.json
index 21ff685..561719e 100644
--- a/jdbc/src/main/resources/interpreter-setting.json
+++ b/jdbc/src/main/resources/interpreter-setting.json
@@ -8,19 +8,22 @@
         "envName": null,
         "propertyName": "default.url",
         "defaultValue": "jdbc:postgresql://localhost:5432/",
-        "description": "The URL for JDBC."
+        "description": "The URL for JDBC.",
+        "type": "string"
       },
       "default.user": {
         "envName": null,
         "propertyName": "default.user",
         "defaultValue": "gpadmin",
-        "description": "The JDBC user name"
+        "description": "The JDBC user name",
+        "type": "string"
       },
       "default.password": {
         "envName": null,
         "propertyName": "default.password",
         "defaultValue": "",
-        "description": "The JDBC user password"
+        "description": "The JDBC user password",
+        "type": "password"
       },
       "default.completer.ttlInSeconds": {
         "envName": null,
@@ -32,19 +35,22 @@
         "envName": null,
         "propertyName": "default.driver",
         "defaultValue": "org.postgresql.Driver",
-        "description": "JDBC Driver Name"
+        "description": "JDBC Driver Name",
+        "type": "string"
       },
       "default.completer.schemaFilters": {
         "envName": null,
         "propertyName": "default.completer.schemaFilters",
         "defaultValue": "",
-        "description": "Comma separated schema (schema = catalog = database) filters to get metadata for completions. Supports '%' symbol is equivalent to any set of characters. (ex. prod_v_%,public%,info)"
+        "description": "Comma separated schema (schema = catalog = database) filters to get metadata for completions. Supports '%' symbol is equivalent to any set of characters. (ex. prod_v_%,public%,info)",
+        "type": "textarea"
       },
       "default.precode": {
         "envName": null,
         "propertyName": "zeppelin.jdbc.precode",
         "defaultValue": "",
-        "description": "SQL which executes while opening connection"
+        "description": "SQL which executes while opening connection",
+        "type": "textarea"
       },
       "default.splitQueries": {
         "envName": null,
@@ -56,37 +62,43 @@
         "envName": null,
         "propertyName": "common.max_count",
         "defaultValue": "1000",
-        "description": "Max number of SQL result to display."
+        "description": "Max number of SQL result to display.",
+        "type": "number"
       },
       "zeppelin.jdbc.auth.type": {
         "envName": null,
         "propertyName": "zeppelin.jdbc.auth.type",
         "defaultValue": "",
-        "description": "If auth type is needed, Example: KERBEROS"
+        "description": "If auth type is needed, Example: KERBEROS",
+        "type": "string"
       },
       "zeppelin.jdbc.concurrent.use": {
         "envName": null,
         "propertyName": "zeppelin.jdbc.concurrent.use",
-        "defaultValue": "true",
-        "description": "Use parallel scheduler"
+        "defaultValue": true,
+        "description": "Use parallel scheduler",
+        "type": "checkbox"
       },
       "zeppelin.jdbc.concurrent.max_connection": {
         "envName": null,
         "propertyName": "zeppelin.jdbc.concurrent.max_connection",
         "defaultValue": "10",
-        "description": "Number of concurrent execution"
+        "description": "Number of concurrent execution",
+        "type": "number"
       },
       "zeppelin.jdbc.keytab.location": {
         "envName": null,
         "propertyName": "zeppelin.jdbc.keytab.location",
         "defaultValue": "",
-        "description": "Kerberos keytab location"
+        "description": "Kerberos keytab location",
+        "type": "string"
       },
       "zeppelin.jdbc.principal": {
         "envName": null,
         "propertyName": "zeppelin.jdbc.principal",
         "defaultValue": "",
-        "description": "Kerberos principal"
+        "description": "Kerberos principal",
+        "type": "string"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/kylin/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/kylin/src/main/resources/interpreter-setting.json b/kylin/src/main/resources/interpreter-setting.json
index a8c0d32..f5f79f9 100644
--- a/kylin/src/main/resources/interpreter-setting.json
+++ b/kylin/src/main/resources/interpreter-setting.json
@@ -8,43 +8,50 @@
         "envName": null,
         "propertyName": "kylin.api.url",
         "defaultValue": "http://localhost:7070/kylin/api/query",
-        "description": "Kylin API"
+        "description": "Kylin API",
+        "type": "url"
       },
       "kylin.api.user": {
         "envName": null,
         "propertyName": "kylin.api.user",
         "defaultValue": "ADMIN",
-        "description": "Kylin username"
+        "description": "Kylin username",
+        "type": "string"
       },
       "kylin.api.password": {
         "envName": null,
         "propertyName": "kylin.api.password",
         "defaultValue": "KYLIN",
-        "description": "Kylin password"
+        "description": "Kylin password",
+        "type": "password"
       },
       "kylin.query.project": {
         "envName": null,
         "propertyName": "kylin.query.project",
         "defaultValue": "learn_kylin",
-        "description": "Default Kylin project name"
+        "description": "Default Kylin project name",
+        "type": "textarea"
       },
       "kylin.query.offset": {
         "envName": null,
         "propertyName": "kylin.query.offset",
         "defaultValue": "0",
-        "description": "Kylin query offset"
+        "description": "Kylin query offset",
+        "type": "number"
       },
       "kylin.query.limit": {
         "envName": null,
         "propertyName": "kylin.query.limit",
         "defaultValue": "5000",
-        "description": "Kylin query limit"
+        "description": "Kylin query limit",
+        "type": "number"
       },
       "kylin.query.ispartial": {
         "envName": null,
         "propertyName": "kylin.query.ispartial",
-        "defaultValue": "true",
-        "description": "Kylin query partial flag, deprecated"
+        "defaultValue": true,
+        "description": "Kylin query partial flag, deprecated",
+        "type": "checkbox"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/lens/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/lens/src/main/resources/interpreter-setting.json b/lens/src/main/resources/interpreter-setting.json
index 427b01f..5d345e6 100644
--- a/lens/src/main/resources/interpreter-setting.json
+++ b/lens/src/main/resources/interpreter-setting.json
@@ -7,44 +7,51 @@
       "zeppelin.lens.run.concurrent": {
         "envName": null,
         "propertyName": "zeppelin.lens.run.concurrent",
-        "defaultValue": "true",
-        "description": "Run concurrent Lens Sessions"
+        "defaultValue": true,
+        "description": "Run concurrent Lens Sessions",
+        "type": "checkbox"
       },
       "zeppelin.lens.maxThreads": {
         "envName": null,
         "propertyName": "zeppelin.lens.maxThreads",
         "defaultValue": "10",
-        "description": "If concurrency is true then how many threads?"
+        "description": "If concurrency is true then how many threads?",
+        "type": "number"
       },
       "zeppelin.lens.maxResults": {
         "envName": null,
         "propertyName": "zeppelin.lens.maxResults",
         "defaultValue": "1000",
-        "description": "max number of rows to display"
+        "description": "max number of rows to display",
+        "type": "number"
       },
       "lens.server.base.url": {
         "envName": null,
         "propertyName": "lens.server.base.url",
         "defaultValue": "http://<hostname>:<port>/lensapi",
-        "description": "The URL for Lens Server"
+        "description": "The URL for Lens Server",
+        "type": "url"
       },
       "lens.client.dbname": {
         "envName": null,
         "propertyName": "lens.client.dbname",
         "defaultValue": "default",
-        "description": "The database schema name"
+        "description": "The database schema name",
+        "type": "string"
       },
       "lens.query.enable.persistent.resultset": {
         "envName": null,
         "propertyName": "lens.query.enable.persistent.resultset",
-        "defaultValue": "false",
-        "description": "Apache Lens to persist result in HDFS?"
+        "defaultValue": false,
+        "description": "Apache Lens to persist result in HDFS?",
+        "type": "checkbox"
       },
       "lens.session.cluster.user": {
         "envName": null,
         "propertyName": "lens.session.cluster.user",
         "defaultValue": "default",
-        "description": "Hadoop cluster username"
+        "description": "Hadoop cluster username",
+        "type": "string"
       }
     }
   }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/livy/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/livy/src/main/resources/interpreter-setting.json b/livy/src/main/resources/interpreter-setting.json
index 8d3dea0..a5673c7 100644
--- a/livy/src/main/resources/interpreter-setting.json
+++ b/livy/src/main/resources/interpreter-setting.json
@@ -9,88 +9,105 @@
         "envName": "ZEPPELIN_LIVY_HOST_URL",
         "propertyName": "zeppelin.livy.url",
         "defaultValue": "http://localhost:8998",
-        "description": "The URL for Livy Server."
+        "description": "The URL for Livy Server.",
+        "type": "url"
       },
       "zeppelin.livy.session.create_timeout": {
         "envName": "ZEPPELIN_LIVY_SESSION_CREATE_TIMEOUT",
         "propertyName": "zeppelin.livy.session.create_timeout",
         "defaultValue": "120",
-        "description": "Livy Server create session timeout (seconds)."
+        "description": "Livy Server create session timeout (seconds).",
+        "type": "number"
       },
       "livy.spark.driver.cores": {
         "propertyName": "livy.spark.driver.cores",
         "defaultValue": "",
-        "description": "Driver cores. ex) 1, 2"
+        "description": "Driver cores. ex) 1, 2",
+        "type": "number"
       },
       "livy.spark.driver.memory": {
         "propertyName": "livy.spark.driver.memory",
         "defaultValue": "",
-        "description": "Driver memory. ex) 512m, 32g"
+        "description": "Driver memory. ex) 512m, 32g",
+        "type": "string"
       },
       "livy.spark.executor.instances": {
         "propertyName": "livy.spark.executor.instances",
         "defaultValue": "",
-        "description": "Executor instances. ex) 1, 4"
+        "description": "Executor instances. ex) 1, 4",
+        "type": "number"
       },
       "livy.spark.executor.cores": {
         "propertyName": "livy.spark.executor.cores",
         "defaultValue": "",
-        "description": "Num cores per executor. ex) 1, 4"
+        "description": "Num cores per executor. ex) 1, 4",
+        "type": "number"
       },
       "livy.spark.executor.memory": {
         "propertyName": "livy.spark.executor.memory",
         "defaultValue": "",
-        "description": "Executor memory per worker instance. ex) 512m, 32g"
+        "description": "Executor memory per worker instance. ex) 512m, 32g",
+        "type": "string"
       },
       "livy.spark.dynamicAllocation.enabled": {
         "propertyName": "livy.spark.dynamicAllocation.enabled",
-        "defaultValue": "",
-        "description": "Use dynamic resource allocation"
+        "defaultValue": false,
+        "description": "Use dynamic resource allocation",
+        "type": "checkbox"
       },
       "livy.spark.dynamicAllocation.cachedExecutorIdleTimeout": {
         "propertyName": "livy.spark.dynamicAllocation.cachedExecutorIdleTimeout",
         "defaultValue": "",
-        "description": "Remove an executor which has cached data blocks"
+        "description": "Remove an executor which has cached data blocks",
+        "type": "string"
       },
       "livy.spark.dynamicAllocation.minExecutors": {
         "propertyName": "livy.spark.dynamicAllocation.minExecutors",
         "defaultValue": "",
-        "description": "Lower bound for the number of executors if dynamic allocation is enabled."
+        "description": "Lower bound for the number of executors if dynamic allocation is enabled.",
+        "type": "number"
       },
       "livy.spark.dynamicAllocation.initialExecutors": {
         "propertyName": "livy.spark.dynamicAllocation.initialExecutors",
         "defaultValue": "",
-        "description": "Initial number of executors to run if dynamic allocation is enabled."
+        "description": "Initial number of executors to run if dynamic allocation is enabled.",
+        "type": "number"
       },
       "livy.spark.dynamicAllocation.maxExecutors": {
         "propertyName": "livy.spark.dynamicAllocation.maxExecutors",
         "defaultValue": "",
-        "description": "Upper bound for the number of executors if dynamic allocation is enabled."
+        "description": "Upper bound for the number of executors if dynamic allocation is enabled.",
+        "type": "number"
       },
       "zeppelin.livy.principal": {
         "propertyName": "zeppelin.livy.principal",
         "defaultValue": "",
-        "description": "Kerberos principal to authenticate livy"
+        "description": "Kerberos principal to authenticate livy",
+        "type": "string"
       },
       "zeppelin.livy.keytab": {
         "propertyName": "zeppelin.livy.keytab",
         "defaultValue": "",
-        "description": "Kerberos keytab to authenticate livy"
+        "description": "Kerberos keytab to authenticate livy",
+        "type": "textarea"
       },
       "zeppelin.livy.pull_status.interval.millis": {
         "propertyName": "zeppelin.livy.pull_status.interval.millis",
         "defaultValue": "1000",
-        "description": "The interval for checking paragraph execution status"
+        "description": "The interval for checking paragraph execution status",
+        "type": "number"
       },
       "livy.spark.jars.packages": {
         "propertyName": "livy.spark.jars.packages",
         "defaultValue": "",
-        "description": "Adding extra libraries to livy interpreter"
+        "description": "Adding extra libraries to livy interpreter",
+        "type": "textarea"
       },
       "zeppelin.livy.displayAppInfo": {
         "propertyName": "zeppelin.livy.displayAppInfo",
-        "defaultValue": "false",
-        "description": "Whether display app info"
+        "defaultValue": false,
+        "description": "Whether display app info",
+        "type": "checkbox"
       }
     },
     "option": {
@@ -116,17 +133,20 @@
         "envName": "ZEPPELIN_LIVY_MAXRESULT",
         "propertyName": "zeppelin.livy.spark.sql.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of Spark SQL result to display."
+        "description": "Max number of Spark SQL result to display.",
+        "type": "number"
       },
       "zeppelin.livy.spark.sql.field.truncate": {
         "propertyName": "zeppelin.livy.spark.sql.field.truncate",
-        "defaultValue": "true",
-        "description": "If true, truncate field values longer than 20 characters."
+        "defaultValue": true,
+        "description": "If true, truncate field values longer than 20 characters.",
+        "type": "checkbox"
       },
       "zeppelin.livy.concurrentSQL": {
         "propertyName": "zeppelin.livy.concurrentSQL",
-        "defaultValue": "false",
-        "description": "Execute multiple SQL concurrently if set true."
+        "defaultValue": false,
+        "description": "Execute multiple SQL concurrently if set true.",
+        "type": "checkbox"
       }
     },
     "option": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/markdown/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/markdown/src/main/resources/interpreter-setting.json b/markdown/src/main/resources/interpreter-setting.json
index 9e670da..9819210 100644
--- a/markdown/src/main/resources/interpreter-setting.json
+++ b/markdown/src/main/resources/interpreter-setting.json
@@ -8,7 +8,8 @@
         "envName": "MARKDOWN_PARSER_TYPE",
         "propertyName": "markdown.parser.type",
         "defaultValue": "pegdown",
-        "description": "Markdown Parser Type. Available values: pegdown, markdown4j. Default = pegdown"
+        "description": "Markdown Parser Type. Available values: pegdown, markdown4j. Default = pegdown",
+        "type": "string"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/pig/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/pig/src/main/resources/interpreter-setting.json b/pig/src/main/resources/interpreter-setting.json
index c1eb69a..058e71b 100644
--- a/pig/src/main/resources/interpreter-setting.json
+++ b/pig/src/main/resources/interpreter-setting.json
@@ -8,25 +8,29 @@
         "envName": null,
         "propertyName": "zeppelin.pig.execType",
         "defaultValue": "mapreduce",
-        "description": "local | mapreduce | tez_local | tez | spark_local | spark"
+        "description": "local | mapreduce | tez_local | tez | spark_local | spark",
+        "type": "string"
       },
       "zeppelin.pig.includeJobStats": {
         "envName": null,
         "propertyName": "zeppelin.pig.includeJobStats",
-        "defaultValue": "false",
-        "description": "flag to include job stats in output"
+        "defaultValue": false,
+        "description": "flag to include job stats in output",
+        "type": "checkbox"
       },
       "SPARK_MASTER": {
         "envName": "SPARK_MASTER",
         "propertyName": "SPARK_MASTER",
         "defaultValue": "local",
-        "description": "local | yarn-client"
+        "description": "local | yarn-client",
+        "type": "string"
       },
       "SPARK_JAR": {
         "envName": "SPARK_JAR",
         "propertyName": "SPARK_JAR",
         "defaultValue": "",
-        "description": "spark assembly jar uploaded in hdfs"
+        "description": "spark assembly jar uploaded in hdfs",
+        "type": "textarea"
       }
     },
     "editor": {
@@ -43,7 +47,8 @@
         "envName": null,
         "propertyName": "zeppelin.pig.maxResult",
         "defaultValue": "1000",
-        "description": "max row number for %pig.query"
+        "description": "max row number for %pig.query",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/python/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/python/src/main/resources/interpreter-setting.json b/python/src/main/resources/interpreter-setting.json
index bc4d4ec..3bc42b8 100644
--- a/python/src/main/resources/interpreter-setting.json
+++ b/python/src/main/resources/interpreter-setting.json
@@ -8,13 +8,15 @@
         "envName": null,
         "propertyName": "zeppelin.python",
         "defaultValue": "python",
-        "description": "Python directory. It is set to python by default.(assume python is in your $PATH)"
+        "description": "Python directory. It is set to python by default.(assume python is in your $PATH)",
+        "type": "string"
       },
       "zeppelin.python.maxResult": {
         "envName": null,
         "propertyName": "zeppelin.python.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of dataframe rows to display."
+        "description": "Max number of dataframe rows to display.",
+        "type": "number"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/r/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/r/src/main/resources/interpreter-setting.json b/r/src/main/resources/interpreter-setting.json
index 3a751eb..b7dcaf7 100644
--- a/r/src/main/resources/interpreter-setting.json
+++ b/r/src/main/resources/interpreter-setting.json
@@ -6,19 +6,23 @@
     "properties": {
       "rhadoop.cmd": {
         "envName": "HADOOP_CMD",
-        "defaultValue": ""
+        "defaultValue": "",
+        "type": "textarea"
       },
       "rhadooop.streamingjar": {
         "envName": "HADOOP_STREAMING",
-        "defaultValue": ""
+        "defaultValue": "",
+        "type": "textarea"
       },
       "rscala.debug": {
         "envName": "RSCALA_DEBUG",
-        "defaultValue": "false"
+        "defaultValue": false,
+        "type": "checkbox"
       },
       "rscala.timeout": {
         "envName": "RSCALA_TIMEOUT",
-        "defaultValue": "60"
+        "defaultValue": "60",
+        "type": "number"
       }
     }
   },
@@ -29,19 +33,23 @@
     "properties": {
       "rhadoop.cmd": {
         "envName": "HADOOP_CMD",
-        "defaultValue": ""
+        "defaultValue": "",
+        "type": "textarea"
       },
       "rhadooop.streamingjar": {
         "envName": "HADOOP_STREAMING",
-        "defaultValue": ""
+        "defaultValue": "",
+        "type": "textarea"
       },
       "rscala.debug": {
         "envName": "RSCALA_DEBUG",
-        "defaultValue": "false"
+        "defaultValue": false,
+        "type": "checkbox"
       },
       "rscala.timeout": {
         "envName": "RSCALA_TIMEOUT",
-        "defaultValue": "60"
+        "defaultValue": "60",
+        "type": "number"
       }
     }
   }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/scalding/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/scalding/src/main/resources/interpreter-setting.json b/scalding/src/main/resources/interpreter-setting.json
index a2efa4d..ca6cd92 100644
--- a/scalding/src/main/resources/interpreter-setting.json
+++ b/scalding/src/main/resources/interpreter-setting.json
@@ -7,12 +7,14 @@
       "args.string": {
         "envName": null,
         "defaultValue": "--local --repl",
-        "description": "Arguments for scalding REPL"
+        "description": "Arguments for scalding REPL",
+        "type": "textarea"
       },
       "max.open.instances": {
         "envName": null,
         "defaultValue": "50",
-        "description": "Maximum number of open interpreter instances"
+        "description": "Maximum number of open interpreter instances",
+        "type": "number"
       }
     }
   }

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/shell/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/shell/src/main/resources/interpreter-setting.json b/shell/src/main/resources/interpreter-setting.json
index c12b545..7728d5f 100644
--- a/shell/src/main/resources/interpreter-setting.json
+++ b/shell/src/main/resources/interpreter-setting.json
@@ -8,25 +8,29 @@
         "envName": "SHELL_COMMAND_TIMEOUT",
         "propertyName": "shell.command.timeout.millisecs",
         "defaultValue": "60000",
-        "description": "Shell command time out in millisecs. Default = 60000"
+        "description": "Shell command time out in millisecs. Default = 60000",
+        "type": "number"
       },
       "zeppelin.shell.auth.type": {
         "envName": null,
         "propertyName": "zeppelin.shell.auth.type",
         "defaultValue": "",
-        "description": "If auth type is needed, Example: KERBEROS"
+        "description": "If auth type is needed, Example: KERBEROS",
+        "type": "string"
       },
       "zeppelin.shell.keytab.location": {
         "envName": null,
         "propertyName": "zeppelin.shell.keytab.location",
         "defaultValue": "",
-        "description": "Kerberos keytab location"
+        "description": "Kerberos keytab location",
+        "type": "string"
       },
       "zeppelin.shell.principal": {
         "envName": null,
         "propertyName": "zeppelin.shell.principal",
         "defaultValue": "",
-        "description": "Kerberos principal"
+        "description": "Kerberos principal",
+        "type": "string"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
----------------------------------------------------------------------
diff --git a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
index 490e33f..c170e4e 100644
--- a/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
+++ b/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java
@@ -26,17 +26,21 @@ import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
 import java.net.URL;
 import java.net.URLClassLoader;
-import java.util.*;
+import java.util.ArrayList;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.NoSuchElementException;
+import java.util.Properties;
+import java.util.Set;
 import java.util.concurrent.atomic.AtomicInteger;
 
-import com.google.common.base.Joiner;
-
 import org.apache.commons.lang3.StringUtils;
 import org.apache.hadoop.security.UserGroupInformation;
+import org.apache.spark.SecurityManager;
 import org.apache.spark.SparkConf;
 import org.apache.spark.SparkContext;
 import org.apache.spark.SparkEnv;
-import org.apache.spark.SecurityManager;
 import org.apache.spark.api.java.JavaSparkContext;
 import org.apache.spark.repl.SparkILoop;
 import org.apache.spark.scheduler.ActiveJob;
@@ -46,23 +50,35 @@ import org.apache.spark.scheduler.SparkListenerJobStart;
 import org.apache.spark.sql.SQLContext;
 import org.apache.spark.ui.SparkUI;
 import org.apache.spark.ui.jobs.JobProgressListener;
-import org.apache.zeppelin.interpreter.*;
+import org.apache.zeppelin.interpreter.BaseZeppelinContext;
+import org.apache.zeppelin.interpreter.DefaultInterpreterProperty;
+import org.apache.zeppelin.interpreter.Interpreter;
+import org.apache.zeppelin.interpreter.InterpreterContext;
+import org.apache.zeppelin.interpreter.InterpreterException;
+import org.apache.zeppelin.interpreter.InterpreterHookRegistry;
+import org.apache.zeppelin.interpreter.InterpreterResult;
 import org.apache.zeppelin.interpreter.InterpreterResult.Code;
+import org.apache.zeppelin.interpreter.InterpreterUtils;
+import org.apache.zeppelin.interpreter.WrappedInterpreter;
+import org.apache.zeppelin.interpreter.remote.RemoteEventClientWrapper;
+import org.apache.zeppelin.interpreter.thrift.InterpreterCompletion;
 import org.apache.zeppelin.interpreter.util.InterpreterOutputStream;
 import org.apache.zeppelin.resource.ResourcePool;
 import org.apache.zeppelin.resource.WellKnownResourceName;
-import org.apache.zeppelin.interpreter.remote.RemoteEventClientWrapper;
-import org.apache.zeppelin.interpreter.thrift.InterpreterCompletion;
 import org.apache.zeppelin.scheduler.Scheduler;
 import org.apache.zeppelin.scheduler.SchedulerFactory;
 import org.apache.zeppelin.spark.dep.SparkDependencyContext;
 import org.apache.zeppelin.spark.dep.SparkDependencyResolver;
-import org.apache.zeppelin.user.AuthenticationInfo;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
-import scala.*;
+import com.google.common.base.Joiner;
+import scala.Console;
 import scala.Enumeration.Value;
+import scala.None;
+import scala.Option;
+import scala.Some;
+import scala.Tuple2;
 import scala.collection.Iterator;
 import scala.collection.JavaConversions;
 import scala.collection.JavaConverters;
@@ -171,7 +187,7 @@ public class SparkInterpreter extends Interpreter {
         String jobUrl = getJobUrl(jobId);
         String noteId = Utils.getNoteId(jobGroupId);
         String paragraphId = Utils.getParagraphId(jobGroupId);
-          
+
         if (jobUrl != null && noteId != null && paragraphId != null) {
           RemoteEventClientWrapper eventClient = BaseZeppelinContext.getEventClient();
           Map<String, String> infos = new java.util.HashMap<>();
@@ -511,11 +527,14 @@ public class SparkInterpreter extends Interpreter {
   }
 
   private void setupConfForPySpark(SparkConf conf) {
-    String pysparkBasePath = new InterpreterProperty("SPARK_HOME", null, null, null).getValue();
+    Object pysparkBaseProperty =
+        new DefaultInterpreterProperty("SPARK_HOME", null, null).getValue();
+    String pysparkBasePath = pysparkBaseProperty != null ? pysparkBaseProperty.toString() : null;
     File pysparkPath;
     if (null == pysparkBasePath) {
       pysparkBasePath =
-              new InterpreterProperty("ZEPPELIN_HOME", "zeppelin.home", "../", null).getValue();
+          new DefaultInterpreterProperty("ZEPPELIN_HOME", "zeppelin.home", "../")
+              .getValue().toString();
       pysparkPath = new File(pysparkBasePath,
           "interpreter" + File.separator + "spark" + File.separator + "pyspark");
     } else {
@@ -560,11 +579,14 @@ public class SparkInterpreter extends Interpreter {
   }
 
   private void setupConfForSparkR(SparkConf conf) {
-    String sparkRBasePath = new InterpreterProperty("SPARK_HOME", null, null, null).getValue();
+    Object sparkRBaseProperty =
+        new DefaultInterpreterProperty("SPARK_HOME", null, null).getValue();
+    String sparkRBasePath = sparkRBaseProperty != null ? sparkRBaseProperty.toString() : null;
     File sparkRPath;
     if (null == sparkRBasePath) {
       sparkRBasePath =
-              new InterpreterProperty("ZEPPELIN_HOME", "zeppelin.home", "../", null).getValue();
+          new DefaultInterpreterProperty("ZEPPELIN_HOME", "zeppelin.home", "../")
+              .getValue().toString();
       sparkRPath = new File(sparkRBasePath,
               "interpreter" + File.separator + "spark" + File.separator + "R");
     } else {

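The two hunks above make the same change in `setupConfForPySpark` and `setupConfForSparkR`: the removed `InterpreterProperty` call returned a `String` from `getValue()`, while `DefaultInterpreterProperty.getValue()` returns an `Object` (resolved from the environment variable, then the system property, then the default), so callers must now null-check before calling `toString()`. The resolution order and the null-safe conversion can be sketched in isolation — the class below is a simplified stand-in for illustration, not Zeppelin's actual implementation:

```java
// Simplified stand-in for Zeppelin's DefaultInterpreterProperty (illustrative only):
// getValue() resolves env var -> system property -> default value, returning Object.
class DefaultInterpreterProperty {
    private final String envName;
    private final String propertyName;
    private final Object defaultValue;

    DefaultInterpreterProperty(String envName, String propertyName, Object defaultValue) {
        this.envName = envName;
        this.propertyName = propertyName;
        this.defaultValue = defaultValue;
    }

    Object getValue() {
        if (envName != null && System.getenv(envName) != null) {
            return System.getenv(envName);
        }
        if (propertyName != null && System.getProperty(propertyName) != null) {
            return System.getProperty(propertyName);
        }
        return defaultValue;
    }
}

public class PropertyValueDemo {
    public static void main(String[] args) {
        // No default is supplied for SPARK_HOME, so getValue() may return null
        // when the env var is unset -- exactly the case the ternary guards against.
        Object raw = new DefaultInterpreterProperty("SPARK_HOME", null, null).getValue();
        String basePath = raw != null ? raw.toString() : null;
        System.out.println("basePath=" + basePath);

        // Falls back to the non-null default "../" when neither the env var
        // nor the system property is set, so toString() is safe here.
        Object home = new DefaultInterpreterProperty("ZEPPELIN_HOME", "zeppelin.home", "../").getValue();
        System.out.println("zeppelinHome=" + home.toString());
    }
}
```

Calling `toString()` directly on the `SPARK_HOME` lookup would throw a `NullPointerException` whenever the variable is unset and no default exists, which is why both hunks route the result through the null check first.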
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/spark/src/main/resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/spark/src/main/resources/interpreter-setting.json b/spark/src/main/resources/interpreter-setting.json
index c8acc2f..e96265f 100644
--- a/spark/src/main/resources/interpreter-setting.json
+++ b/spark/src/main/resources/interpreter-setting.json
@@ -9,56 +9,64 @@
         "envName": null,
         "propertyName": "spark.executor.memory",
         "defaultValue": "",
-        "description": "Executor memory per worker instance. ex) 512m, 32g"
+        "description": "Executor memory per worker instance. ex) 512m, 32g",
+        "type": "string"
       },
       "args": {
         "envName": null,
         "propertyName": null,
         "defaultValue": "",
-        "description": "spark commandline args"
+        "description": "spark commandline args",
+        "type": "textarea"
       },
       "zeppelin.spark.useHiveContext": {
         "envName": "ZEPPELIN_SPARK_USEHIVECONTEXT",
         "propertyName": "zeppelin.spark.useHiveContext",
-        "defaultValue": "true",
-        "description": "Use HiveContext instead of SQLContext if it is true."
+        "defaultValue": true,
+        "description": "Use HiveContext instead of SQLContext if it is true.",
+        "type": "checkbox"
       },
       "spark.app.name": {
         "envName": "SPARK_APP_NAME",
-
         "propertyName": "spark.app.name",
         "defaultValue": "Zeppelin",
-        "description": "The name of spark application."
+        "description": "The name of spark application.",
+        "type": "string"
       },
       "zeppelin.spark.printREPLOutput": {
         "envName": null,
-        "propertyName": null,
-        "defaultValue": "true",
-        "description": "Print REPL output"
+        "propertyName": "zeppelin.spark.printREPLOutput",
+        "defaultValue": true,
+        "description": "Print REPL output",
+        "type": "checkbox"
       },
       "spark.cores.max": {
         "envName": null,
         "propertyName": "spark.cores.max",
         "defaultValue": "",
-        "description": "Total number of cores to use. Empty value uses all available core."
+        "description": "Total number of cores to use. Empty value uses all available core.",
+        "type": "number"
       },
       "zeppelin.spark.maxResult": {
         "envName": "ZEPPELIN_SPARK_MAXRESULT",
         "propertyName": "zeppelin.spark.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of Spark SQL result to display."
+        "description": "Max number of Spark SQL result to display.",
+        "type": "number"
       },
       "master": {
         "envName": "MASTER",
         "propertyName": "spark.master",
         "defaultValue": "local[*]",
-        "description": "Spark master uri. ex) spark://masterhost:7077"
+        "description": "Spark master uri. ex) spark://masterhost:7077",
+        "type": "string"
       },
       "zeppelin.spark.unSupportedVersionCheck": {
         "envName": null,
         "propertyName": "zeppelin.spark.enableSupportedVersionCheck",
-        "defaultValue": "true",
-        "description": "Do not change - developer only setting, not for production use"
+        "defaultValue": true,
+        "description": "Do not change - developer only setting, not for production use",
+        "type": "checkbox"
       }
     },
     "editor": {
@@ -74,26 +82,30 @@
       "zeppelin.spark.concurrentSQL": {
         "envName": "ZEPPELIN_SPARK_CONCURRENTSQL",
         "propertyName": "zeppelin.spark.concurrentSQL",
-        "defaultValue": "false",
-        "description": "Execute multiple SQL concurrently if set true."
+        "defaultValue": false,
+        "description": "Execute multiple SQL concurrently if set true.",
+        "type": "checkbox"
       },
       "zeppelin.spark.sql.stacktrace": {
         "envName": "ZEPPELIN_SPARK_SQL_STACKTRACE",
         "propertyName": "zeppelin.spark.sql.stacktrace",
-        "defaultValue": "false",
-        "description": "Show full exception stacktrace for SQL queries if set to true."
+        "defaultValue": false,
+        "description": "Show full exception stacktrace for SQL queries if set to true.",
+        "type": "checkbox"
       },
       "zeppelin.spark.maxResult": {
         "envName": "ZEPPELIN_SPARK_MAXRESULT",
         "propertyName": "zeppelin.spark.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of Spark SQL result to display."
+        "description": "Max number of Spark SQL result to display.",
+        "type": "number"
       },
       "zeppelin.spark.importImplicit": {
         "envName": "ZEPPELIN_SPARK_IMPORTIMPLICIT",
         "propertyName": "zeppelin.spark.importImplicit",
-        "defaultValue": "true",
-        "description": "Import implicits, UDF collection, and sql if set true. true by default."
+        "defaultValue": true,
+        "description": "Import implicits, UDF collection, and sql if set true. true by default.",
+        "type": "checkbox"
       }
     },
     "editor": {
@@ -110,13 +122,15 @@
         "envName": "ZEPPELIN_DEP_LOCALREPO",
         "propertyName": null,
         "defaultValue": "local-repo",
-        "description": "local repository for dependency loader"
+        "description": "local repository for dependency loader",
+        "type": "string"
       },
       "zeppelin.dep.additionalRemoteRepository": {
         "envName": null,
         "propertyName": null,
         "defaultValue": "spark-packages,http://dl.bintray.com/spark-packages/maven,false;",
-        "description": "A list of 'id,remote-repository-URL,is-snapshot;' for each remote repository."
+        "description": "A list of 'id,remote-repository-URL,is-snapshot;' for each remote repository.",
+        "type": "textarea"
       }
     },
     "editor": {
@@ -133,7 +147,8 @@
         "envName": "PYSPARK_PYTHON",
         "propertyName": null,
         "defaultValue": "python",
-        "description": "Python command to run pyspark with"
+        "description": "Python command to run pyspark with",
+        "type": "string"
       }
     },
     "editor": {

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/155a55b5/spark/src/main/sparkr-resources/interpreter-setting.json
----------------------------------------------------------------------
diff --git a/spark/src/main/sparkr-resources/interpreter-setting.json b/spark/src/main/sparkr-resources/interpreter-setting.json
index 6953b20..d0fbd3e 100644
--- a/spark/src/main/sparkr-resources/interpreter-setting.json
+++ b/spark/src/main/sparkr-resources/interpreter-setting.json
@@ -9,56 +9,64 @@
         "envName": null,
         "propertyName": "spark.executor.memory",
         "defaultValue": "",
-        "description": "Executor memory per worker instance. ex) 512m, 32g"
+        "description": "Executor memory per worker instance. ex) 512m, 32g",
+        "type": "string"
       },
       "args": {
         "envName": null,
         "propertyName": null,
         "defaultValue": "",
-        "description": "spark commandline args"
+        "description": "spark commandline args",
+        "type": "string"
       },
       "zeppelin.spark.useHiveContext": {
         "envName": "ZEPPELIN_SPARK_USEHIVECONTEXT",
         "propertyName": "zeppelin.spark.useHiveContext",
-        "defaultValue": "true",
-        "description": "Use HiveContext instead of SQLContext if it is true."
+        "defaultValue": true,
+        "description": "Use HiveContext instead of SQLContext if it is true.",
+        "type": "checkbox"
       },
       "spark.app.name": {
         "envName": "SPARK_APP_NAME",
-
         "propertyName": "spark.app.name",
         "defaultValue": "Zeppelin",
-        "description": "The name of spark application."
+        "description": "The name of spark application.",
+        "type": "string"
       },
       "zeppelin.spark.printREPLOutput": {
         "envName": null,
-        "propertyName": null,
-        "defaultValue": "true",
-        "description": "Print REPL output"
+        "propertyName": "zeppelin.spark.printREPLOutput",
+        "defaultValue": true,
+        "description": "Print REPL output",
+        "type": "checkbox"
       },
       "spark.cores.max": {
         "envName": null,
         "propertyName": "spark.cores.max",
         "defaultValue": "",
-        "description": "Total number of cores to use. Empty value uses all available core."
+        "description": "Total number of cores to use. Empty value uses all available core.",
+        "type": "number"
       },
       "zeppelin.spark.maxResult": {
         "envName": "ZEPPELIN_SPARK_MAXRESULT",
         "propertyName": "zeppelin.spark.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of Spark SQL result to display."
+        "description": "Max number of Spark SQL result to display.",
+        "type": "number"
       },
       "master": {
         "envName": "MASTER",
         "propertyName": "spark.master",
         "defaultValue": "local[*]",
-        "description": "Spark master uri. ex) spark://masterhost:7077"
+        "description": "Spark master uri. ex) spark://masterhost:7077",
+        "type": "string"
       },
       "zeppelin.spark.unSupportedVersionCheck": {
         "envName": null,
         "propertyName": "zeppelin.spark.enableSupportedVersionCheck",
-        "defaultValue": "true",
-        "description": "Do not change - developer only setting, not for production use"
+        "defaultValue": true,
+        "description": "Do not change - developer only setting, not for production use",
+        "type": "checkbox"
       }
     },
     "editor": {
@@ -73,26 +81,30 @@
       "zeppelin.spark.concurrentSQL": {
         "envName": "ZEPPELIN_SPARK_CONCURRENTSQL",
         "propertyName": "zeppelin.spark.concurrentSQL",
-        "defaultValue": "false",
-        "description": "Execute multiple SQL concurrently if set true."
+        "defaultValue": false,
+        "description": "Execute multiple SQL concurrently if set true.",
+        "type": "checkbox"
       },
       "zeppelin.spark.sql.stacktrace": {
         "envName": "ZEPPELIN_SPARK_SQL_STACKTRACE",
         "propertyName": "zeppelin.spark.sql.stacktrace",
-        "defaultValue": "false",
-        "description": "Show full exception stacktrace for SQL queries if set to true."
+        "defaultValue": false,
+        "description": "Show full exception stacktrace for SQL queries if set to true.",
+        "type": "checkbox"
       },
       "zeppelin.spark.maxResult": {
         "envName": "ZEPPELIN_SPARK_MAXRESULT",
         "propertyName": "zeppelin.spark.maxResult",
         "defaultValue": "1000",
-        "description": "Max number of Spark SQL result to display."
+        "description": "Max number of Spark SQL result to display.",
+        "type": "number"
       },
       "zeppelin.spark.importImplicit": {
         "envName": "ZEPPELIN_SPARK_IMPORTIMPLICIT",
         "propertyName": "zeppelin.spark.importImplicit",
-        "defaultValue": "true",
-        "description": "Import implicits, UDF collection, and sql if set true. true by default."
+        "defaultValue": true,
+        "description": "Import implicits, UDF collection, and sql if set true. true by default.",
+        "type": "checkbox"
       }
     },
     "editor": {
@@ -108,13 +120,15 @@
         "envName": "ZEPPELIN_DEP_LOCALREPO",
         "propertyName": null,
         "defaultValue": "local-repo",
-        "description": "local repository for dependency loader"
+        "description": "local repository for dependency loader",
+        "type": "string"
       },
       "zeppelin.dep.additionalRemoteRepository": {
         "envName": null,
         "propertyName": null,
         "defaultValue": "spark-packages,http://dl.bintray.com/spark-packages/maven,false;",
-        "description": "A list of 'id,remote-repository-URL,is-snapshot;' for each remote repository."
+        "description": "A list of 'id,remote-repository-URL,is-snapshot;' for each remote repository.",
+        "type": "textarea"
       }
     },
     "editor": {
@@ -130,7 +144,8 @@
         "envName": "PYSPARK_PYTHON",
         "propertyName": null,
         "defaultValue": "python",
-        "description": "Python command to run pyspark with"
+        "description": "Python command to run pyspark with",
+        "type": "string"
       }
     },
     "editor": {
@@ -145,26 +160,30 @@
       "zeppelin.R.knitr": {
         "envName": "ZEPPELIN_R_KNITR",
         "propertyName": "zeppelin.R.knitr",
-        "defaultValue": "true",
-        "description": "whether use knitr or not"
+        "defaultValue": true,
+        "description": "whether use knitr or not",
+        "type": "checkbox"
       },
       "zeppelin.R.cmd": {
         "envName": "ZEPPELIN_R_CMD",
         "propertyName": "zeppelin.R.cmd",
         "defaultValue": "R",
-        "description": "R repl path"
+        "description": "R repl path",
+        "type": "string"
       },
       "zeppelin.R.image.width": {
         "envName": "ZEPPELIN_R_IMAGE_WIDTH",
         "propertyName": "zeppelin.R.image.width",
         "defaultValue": "100%",
-        "description": ""
+        "description": "",
+        "type": "number"
       },
       "zeppelin.R.render.options": {
         "envName": "ZEPPELIN_R_RENDER_OPTIONS",
         "propertyName": "zeppelin.R.render.options",
         "defaultValue": "out.format = 'html', comment = NA, echo = FALSE, results = 'asis', message = F, warning = F",
-        "description": ""
+        "description": "",
+        "type": "textarea"
       }
     },
     "editor": {


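For orientation, every `interpreter-setting.json` hunk in this commit adds a `type` field that selects the UI widget rendered for the property (per the PR description: string, number, url, password, checkbox, textarea), and checkbox properties switch their `defaultValue` from a quoted string to a JSON boolean. A minimal property entry using the masked `password` widget might look like the sketch below; the property name `default.password` here is illustrative, not part of this commit:

```json
{
  "default.password": {
    "envName": null,
    "propertyName": "default.password",
    "defaultValue": "",
    "description": "Connection password, rendered as a masked input.",
    "type": "password"
  }
}
```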