diff --git a/README-ZH.md b/README-ZH.md
index 19e680fc0..3f348efa9 100644
--- a/README-ZH.md
+++ b/README-ZH.md
@@ -64,4 +64,4 @@ Exchangis 抽象了一套统一的数据源和同步作业定义插件,允许
 
 ## License
 
-Exchangis is under the Apache 2.0 License. See the [License](../../../LICENSE) file for details.
+Exchangis is under the Apache 2.0 License. See the [License](./LICENSE) file for details.
diff --git a/README.md b/README.md
index 8bd867ded..1c9d4fb0a 100644
--- a/README.md
+++ b/README.md
@@ -63,5 +63,5 @@ If you want to get the fastest response, please mention issue to us, or scan the
 
 ## License
 
-Exchangis is under the Apache 2.0 License. See the [License](../../../LICENSE) file for details.
+Exchangis is under the Apache 2.0 License. See the [License](./LICENSE) file for details.
 
diff --git a/docs/en_US/ch1/exchangis_appconn_deploy_en.md b/docs/en_US/ch1/exchangis_appconn_deploy_en.md
index 8867b265b..f42a85742 100644
--- a/docs/en_US/ch1/exchangis_appconn_deploy_en.md
+++ b/docs/en_US/ch1/exchangis_appconn_deploy_en.md
@@ -67,7 +67,7 @@ After the exchangis-appconn is installed and deployed, the following steps can b
 2. Check whether the project is created synchronously on Exchangis. Successful creation means successful installation of appconn
 ![image](https://user-images.githubusercontent.com/27387830/169782337-678f2df0-080a-495a-b59f-a98c5a427cf8.png)
 
-For more operation, please refer to [Exchangis 1.0 User Manual](https://github.com/WeBankFinTech/Exchangis/blob/dev-1.0.0/docs/en_US/ch1/exchangis_user_manual_cn.md)
+For more operation, please refer to [Exchangis 1.0 User Manual](https://github.com/WeBankFinTech/Exchangis/blob/dev-1.0.0/docs/en_US/ch1/exchangis_user_manual_en.md)
 
 ### 5.Exchangis AppConn installation principle
 
diff --git a/docs/en_US/ch1/exchangis_deploy_en.md b/docs/en_US/ch1/exchangis_deploy_en.md
index d73a26173..354bfea8c 100644
--- a/docs/en_US/ch1/exchangis_deploy_en.md
+++ b/docs/en_US/ch1/exchangis_deploy_en.md
@@ -15,11 +15,11 @@ Exchangis installation is mainly divided into the following four steps :
 |------------------------------------------------------------------------------| ------ | --------------- |
 | MySQL (5.5+) | yes | [How to install mysql](https://www.runoob.com/mysql/mysql-install.html) |
 | JDK (1.8.0_141) | yes | [How to install JDK](https://www.runoob.com/java/java-environment-setup.html) |
-| Hadoop(2.7.2,Other versions of Hadoop need to compile Linkis by themselves.) | yes | [Hadoop stand-alone deployment](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) ;[Hadoop distributed deployment](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) |
-| Hive(2.3.3,Other versions of Hive need to compile Linkis by themselves.) | yes | [Hive quick installation](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) |
+| Hadoop(2.7.2,Other versions of Hadoop need to compile Linkis by themselves.) | yes | [Hadoop stand-alone deployment](https://hadoop.apache.org/releases.html) ;[Hadoop distributed deployment](https://hadoop.apache.org/releases.html) |
+| Hive(2.3.3,Other versions of Hive need to compile Linkis by themselves.) | yes | [Hive quick installation](https://hive.apache.org/downloads.html) |
 | SQOOP (1.4.6) | yes | [How to install Sqoop](https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html) |
-| DSS1.1.0 | yes | [How to install DSS](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%E5%8D%95%E6%9C%BA%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3.md) |
-| Linkis1.1.1 | yes | [How to install Linkis](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) |
+| DSS1.1.0 | yes | [How to install DSS](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/en_US/Installation_and_Deployment/DSS%26Linkis_one-click_deployment_document_stand-alone_version.md) |
+| Linkis1.1.1 | yes | [How to install Linkis](https://linkis.apache.org/zh-CN/docs/latest/deployment/deploy-quick) |
 | Nginx | yes | [How to install Nginx](http://nginx.org/en/linux_packages.html) |
 
 Underlying component checking
@@ -32,7 +32,7 @@ $\color{#FF0000}{Note: be sure to reinstall dss1.1.0, and the linkis version mus
 
 datasource enabled
 
-By default, two services related to datasources (ps-data-source-manager, ps-metadatamanager) will not be started in the startup script of linkis. If you want to use datasource services, you can start them by modifying the export enable _ metadata _ manager = true value in $ linkis_conf_dir/linkis-env.sh. When the service is started and stopped through linkis-start-all.sh/linkis-stop-all.sh, the datasource service will be started and stopped. For more details about data sources, please refer to [Data Source Function Usage](https://linkis.apache.org/zh-CN/docs/1.1.0/deployment/start_metadatasource)
+By default, two services related to datasources (ps-data-source-manager, ps-metadatamanager) will not be started in the startup script of linkis. If you want to use datasource services, you can start them by modifying the export enable _ metadata _ manager = true value in $ linkis_conf_dir/linkis-env.sh. When the service is started and stopped through linkis-start-all.sh/linkis-stop-all.sh, the datasource service will be started and stopped. For more details about data sources, please refer to [Data Source Function Usage](https://linkis.apache.org/zh-CN/docs/latest/user-guide/datasource-manual)
 
 #### 1.2 Create Linux users
 
@@ -55,7 +55,7 @@ INSERT INTO `linkis_ps_dm_datasource_env` (`env_name`, `env_desc`, `datasource_t
 INSERT INTO `linkis_ps_dm_datasource_env` (`env_name`, `env_desc`, `datasource_type_id`, `parameter`, `create_time`, `create_user`, `modify_time`, `modify_user`) VALUES ('开发环境UAT', '开发环境UAT', 4, '{"uris":"thrift://${HIVE_METADATA_IP}:${HIVE_METADATA_PORT}", "hadoopConf":{"hive.metastore.execute.setugi":"true"}}', now(), NULL, now(), NULL);
 ```
 
-If the hive data source needs kerberos authentication when deployed, you need to specify a parameter keyTab in the parameter field of the Linkis_ps_dm_datasource_env table, and the way to obtain its value can be seen: [Setting and authenticating hive data source in linkis](https://linkis.apache.org/zh-CN/docs/1.1.1/deployment/start_metadatasource).
+If the hive data source needs kerberos authentication when deployed, you need to specify a parameter keyTab in the parameter field of the Linkis_ps_dm_datasource_env table, and the way to obtain its value can be seen: [Setting and authenticating hive data source in linkis](https://linkis.apache.org/zh-CN/docs/latest/user-guide/datasource-manual).
 
 #### 1.4 Underlying component checking
 
@@ -191,7 +191,7 @@ As shown in the figure below:
 
 #### 2.7.1 Get the front-end installation package
 
-Exchangis has provided compiled front-end installation package by default, which can be downloaded and used directly :[Click to jump to the Release interface](https://github.com/WeBankFinTech/Exchangis/releases/download/release-1.0.0/web-dist.zip)
+Exchangis has provided compiled front-end installation package by default, which can be downloaded and used directly :[Click to jump to the Release interface](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Exchangis/exchangis1.0.0/dist.zip)
 
 You can also compile the exchange front-end by yourself and execute the following command in the exchanise root directory:
 
@@ -277,7 +277,7 @@ If you want to use Exchangis1.0.0 front-end, you also need to install the DSS Ex
 
 ## 4. Linkis Sqoop engine installation and deployment
 
-If you want to execute the Sqoop operation of Exchangis1.0.0 normally, you also need to install the Linkis Sqoop engine. Please refer to: : [Linkis Sqoop engine installation documentation ](https://linkis.staged.apache.org/zh-CN/docs/1.1.2/engine_usage/sqoop)
+If you want to execute the Sqoop operation of Exchangis1.0.0 normally, you also need to install the Linkis Sqoop engine. Please refer to: : [Linkis Sqoop engine installation documentation ](https://linkis.apache.org/zh-CN/docs/latest/engine-usage/sqoop)
 
 ## 5. How to log in and use Exchangis
 
diff --git a/docs/en_US/ch1/exchangis_sqoop_deploy_en.md b/docs/en_US/ch1/exchangis_sqoop_deploy_en.md
index 4ee3ee810..9ef74d74c 100644
--- a/docs/en_US/ch1/exchangis_sqoop_deploy_en.md
+++ b/docs/en_US/ch1/exchangis_sqoop_deploy_en.md
@@ -74,4 +74,4 @@ cd {LINKIS_INSTALL_HOME}/links/sbin/
 
 After the service is successfully started, the installation and deployment of sqoop will be completed. For a more detailed introduction of engineplugin, please refer to the following article.
 
-https://linkis.apache.org/zh-CN/docs/latest/deployment/engine_conn_plugin_installation
\ No newline at end of file
+https://linkis.apache.org/zh-CN/docs/latest/architecture/computation-governance-services/engine/engine-conn
\ No newline at end of file
diff --git a/docs/zh_CN/ch1/exchangis_deploy_cn.md b/docs/zh_CN/ch1/exchangis_deploy_cn.md
index edcdc1118..f6943fe21 100644
--- a/docs/zh_CN/ch1/exchangis_deploy_cn.md
+++ b/docs/zh_CN/ch1/exchangis_deploy_cn.md
@@ -15,11 +15,11 @@ Exchangis 的安装,主要分为以下四步:
 |---------------------------------------| ------ | --------------- |
 | MySQL (5.5+) | 必装 | [如何安装mysql](https://www.runoob.com/mysql/mysql-install.html) |
 | JDK (1.8.0_141) | 必装 | [如何安装JDK](https://www.runoob.com/java/java-environment-setup.html) |
-| Hadoop(2.7.2,Hadoop 其他版本需自行编译 Linkis) | 必装 | [Hadoop单机部署](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) ;[Hadoop分布式部署](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) |
-| Hive(2.3.3,Hive 其他版本需自行编译 Linkis) | 必装 | [Hive快速安装](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) |
+| Hadoop(2.7.2,Hadoop 其他版本需自行编译 Linkis) | 必装 | [Hadoop单机部署](https://hadoop.apache.org/releases.html) ;[Hadoop分布式部署](https://hadoop.apache.org/releases.html) |
+| Hive(2.3.3,Hive 其他版本需自行编译 Linkis) | 必装 | [Hive快速安装](https://hive.apache.org/downloads.html) |
 | SQOOP (1.4.6) | 必装 | [如何安装Sqoop](https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html) |
-| DSS1.1.0 | 必装 | [如何安装DSS](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/zh_CN/%E5%AE%89%E8%A3%85%E9%83%A8%E7%BD%B2/DSS%26Linkis%E4%B8%80%E9%94%AE%E9%83%A8%E7%BD%B2%E6%96%87%E6%A1%A3%E5%8D%95%E6%9C%BA%E7%89%88.md) |
-| Linkis1.1.1 | 必装 | [如何安装Linkis](https://linkis.apache.org/zh-CN/docs/latest/deployment/quick_deploy) |
+| DSS1.1.0 | 必装 | [如何安装DSS](https://github.com/WeBankFinTech/DataSphereStudio-Doc/blob/main/en_US/Installation_and_Deployment/DSS%26Linkis_one-click_deployment_document_stand-alone_version.md) |
+| Linkis1.1.1 | 必装 | [如何安装Linkis](https://linkis.apache.org/zh-CN/docs/latest/deployment/deploy-quick) |
 | Nginx | 必装 | [如何安装 Nginx](http://nginx.org/en/linux_packages.html) |
 
 底层依赖组件检查
@@ -32,7 +32,7 @@ $\color{#FF0000}{注意:一定要使用最新版的dss1.1.0,及linkis1.1.1}$
 
 datasource启用
 
-linkis的启动脚本中默认不会启动数据源相关的服务两个服务(ps-data-source-manager,ps-metadatamanager), 如果想使用数据源服务,可以通过如下方式进行开启: 修改$LINKIS_CONF_DIR/linkis-env.sh中的 export ENABLE_METADATA_MANAGER=true值为true。 通过linkis-start-all.sh/linkis-stop-all.sh 进行服务启停时,会进行数据源服务的启动与停止。关于数据源更多详情可参考[数据源功能使用](https://linkis.apache.org/zh-CN/docs/1.1.0/deployment/start_metadatasource)
+linkis的启动脚本中默认不会启动数据源相关的服务两个服务(ps-data-source-manager,ps-metadatamanager), 如果想使用数据源服务,可以通过如下方式进行开启: 修改$LINKIS_CONF_DIR/linkis-env.sh中的 export ENABLE_METADATA_MANAGER=true值为true。 通过linkis-start-all.sh/linkis-stop-all.sh 进行服务启停时,会进行数据源服务的启动与停止。关于数据源更多详情可参考[数据源功能使用](https://linkis.apache.org/zh-CN/docs/latest/user-guide/datasource-manual)
 
 #### 1.2 创建 Linux 用户
 
@@ -57,7 +57,7 @@ INSERT INTO `linkis_ps_dm_datasource_env` (`env_name`, `env_desc`, `datasource_t
 INSERT INTO `linkis_ps_dm_datasource_env` (`env_name`, `env_desc`, `datasource_type_id`, `parameter`, `create_time`, `create_user`, `modify_time`, `modify_user`) VALUES ('开发环境UAT', '开发环境UAT', 4, '{"uris":"thrift://${HIVE_METADATA_IP}:${HIVE_METADATA_PORT}", "hadoopConf":{"hive.metastore.execute.setugi":"true"}}', now(), NULL, now(), NULL);
 ```
 
-如果hive数据源在部署时设置了需要进行kerberos方式认证,则需要在linkis_ps_dm_datasource_env表的parameter字段指定一个参数keyTab,其值的获取方式可见:[在Linkis中设置并认证hive数据源](https://linkis.apache.org/zh-CN/docs/1.1.1/deployment/start_metadatasource)
+如果hive数据源在部署时设置了需要进行kerberos方式认证,则需要在linkis_ps_dm_datasource_env表的parameter字段指定一个参数keyTab,其值的获取方式可见:[在Linkis中设置并认证hive数据源](https://linkis.apache.org/zh-CN/docs/latest/user-guide/datasource-manual)
 
 #### 1.4 底层依赖组件检查
 
@@ -188,7 +188,7 @@ DATABASE={dbName}
 
 #### 2.7.1 获取前端安装包
 
-Exchangis 已默认提供了编译好的前端安装包,可直接下载使用:[点击下载前端安装包](https://github.com/WeBankFinTech/Exchangis/releases/download/release-1.0.0/web-dist.zip)
+Exchangis 已默认提供了编译好的前端安装包,可直接下载使用:[点击下载前端安装包](https://osp-1257653870.cos.ap-guangzhou.myqcloud.com/WeDatasphere/Exchangis/exchangis1.0.0/dist.zip)
 
 您也可以自行编译 Exchangis 前端,在 Exchangis 根目录下执行如下命令:
 
@@ -227,8 +227,8 @@ Exchangis 已默认提供了编译好的前端安装包,可直接下载使用
         server_name localhost;
         #charset koi8-r;
         #access_log /var/log/nginx/host.access.log main;
-        location /dist {
-            root /appcom/Install/exchangis/web; # Exchangis 前端部署目录
+        location / {
+            root /appcom/Install/exchangis/web/dist; # Exchangis 前端部署目录
             autoindex on;
         }
 
@@ -275,7 +275,7 @@ Exchangis 已默认提供了编译好的前端安装包,可直接下载使用
 
 ## 4. Linkis Sqoop 引擎安装部署
 
-如您想正常执行 Exchangis1.0.0 的 Sqoop作业,还需安装 Linkis Sqoop 引擎,请参考: [Linkis Sqoop 引擎插件安装文档](https://linkis.staged.apache.org/zh-CN/docs/1.1.2/engine_usage/sqoop)
+如您想正常执行 Exchangis1.0.0 的 Sqoop作业,还需安装 Linkis Sqoop 引擎,请参考: [Linkis Sqoop 引擎插件安装文档](https://linkis.apache.org/zh-CN/docs/latest/engine-usage/sqoop)
 
 ## 5. 如何登录使用 Exchangis
 
diff --git a/docs/zh_CN/ch1/exchangis_sqoop_deploy_cn.md b/docs/zh_CN/ch1/exchangis_sqoop_deploy_cn.md
index b253bc362..1e457d825 100644
--- a/docs/zh_CN/ch1/exchangis_sqoop_deploy_cn.md
+++ b/docs/zh_CN/ch1/exchangis_sqoop_deploy_cn.md
@@ -73,4 +73,4 @@ cd {LINKIS_INSTALL_HOME}/links/sbin/
 待服务启动成功,至此,sqoop安装部署就完成了。
 
 engineplugin更详细的介绍可以参看下面的文章。
-https://linkis.apache.org/zh-CN/docs/latest/deployment/engine_conn_plugin_installation
+https://linkis.apache.org/zh-CN/docs/latest/architecture/computation-governance-services/engine/engine-conn
diff --git a/web/README.md b/web/README.md
index 43249b1fd..b7500960c 100644
--- a/web/README.md
+++ b/web/README.md
@@ -1,3 +1,3 @@
 # Exchangis Web
 
-基于[FES](https://winixt.gitee.io/fesjs/zh/)打造的管理系统
+基于[FES](https://webank.gitee.io/fes.js/)打造的管理系统