
Configuring multiple data sources in Spring Boot: MySQL, Presto, Hive, etc.


The example below configures multiple data sources (two or more). Only one of them can be the primary data source, and by default MyBatis uses the primary one.

Here MySQL is configured as the primary data source, marked with the @Primary annotation.

YAML configuration:

spring:
  datasource:
    mysqlMain: # MySQL primary data source, can back MyBatis
      type: com.alibaba.druid.pool.DruidDataSource
      url: jdbc:mysql://127.0.0.1:3306/aip?useUnicode=true&characterEncoding=utf-8&zeroDateTimeBehavior=convertToNull&allowMultiQueries=true&useSSL=false&rewriteBatchedStatements=true
      username: root
      password: root
      driver-class-name: com.mysql.cj.jdbc.Driver
    hive: # Hive data source
      url: jdbc:hive2://127.0.0.1:31000
      type: com.alibaba.druid.pool.DruidDataSource
      username: root
      password: root
      driver-class-name: org.apache.hive.jdbc.HiveDriver
    presto: # Presto data source
      url: jdbc:presto://127.0.0.1:30998/hive_dev/aip
      type: com.alibaba.druid.pool.DruidDataSource
      username: root
      password: root
      driver-class-name: com.facebook.presto.jdbc.PrestoDriver
    commonconfig: # shared connection-pool settings applied to every data source
      initialSize: 1
      minIdle: 1
      maxIdle: 5
      maxActive: 50
      maxWait: 10000
      timeBetweenEvictionRunsMillis: 10000
      minEvictableIdleTimeMillis: 300000
      validationQuery: select 'x'
      testWhileIdle: true
      testOnBorrow: false
      testOnReturn: false
      poolPreparedStatements: true
      maxOpenPreparedStatements: 20
      filters: stat

Directory layout: all of the classes below live in the com.sangfor.api.config.druid package.

Code:

DataSourceCommonProperties:

package com.sangfor.api.config.druid;


import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;

/**
 * @author: xmz
 */
@ConfigurationProperties(prefix = DataSourceCommonProperties.DS, ignoreUnknownFields = false)
@Data
public class DataSourceCommonProperties {
    final static String DS = "spring.datasource.commonconfig";
    private int initialSize = 10;
    private int minIdle;
    private int maxIdle;
    private int maxActive;
    private int maxWait;
    private int timeBetweenEvictionRunsMillis;
    private int minEvictableIdleTimeMillis;
    private String validationQuery;
    private boolean testWhileIdle;
    private boolean testOnBorrow;
    private boolean testOnReturn;
    private boolean poolPreparedStatements;
    private int maxOpenPreparedStatements;
    private String filters;
    private String mapperLocations;
    private String typeAliasPackage;
}

DataSourceProperties:

package com.sangfor.api.config.druid;


import java.util.Map;
import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;

/**
 * @author: xmz
 */
@ConfigurationProperties(prefix = DataSourceProperties.DS, ignoreUnknownFields = false)
@Data
public class DataSourceProperties {
    final static String DS = "spring.datasource";
    private Map<String,String> mysqlMain;
    private Map<String,String> hive;
    private Map<String,String> presto;
    private Map<String,String> commonConfig;
}

MysqlMainDruidConfig:

package com.sangfor.api.config.druid;


import java.sql.SQLException;
import javax.sql.DataSource;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import com.alibaba.druid.pool.DruidDataSource;

/**
 * @author:  xmz
 */
@Configuration
@EnableConfigurationProperties({DataSourceProperties.class, DataSourceCommonProperties.class}) // register the @ConfigurationProperties classes as beans
public class MysqlMainDruidConfig {

    private static Logger logger = LoggerFactory.getLogger(MysqlMainDruidConfig.class);

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Autowired
    private DataSourceCommonProperties dataSourceCommonProperties;

    @Primary // marks the primary data source; only one bean may be primary, and MyBatis connects to it by default
    @Bean("mysqlDruidDataSource") // create the bean instance under this name
    @Qualifier("mysqlDruidDataSource") // qualifier for injection by bean name
    public DataSource dataSource(){
        DruidDataSource datasource = new DruidDataSource();
        
        // data-source-specific settings
        datasource.setUrl(dataSourceProperties.getMysqlMain().get("url"));
        datasource.setUsername(dataSourceProperties.getMysqlMain().get("username"));
        datasource.setPassword(dataSourceProperties.getMysqlMain().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getMysqlMain().get("driver-class-name"));
        // shared connection-pool settings
        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
        try {
            datasource.setFilters(dataSourceCommonProperties.getFilters());
        } catch (SQLException e) {
            logger.error("Druid configuration initialization filter error.", e);
        }
        return datasource;
    }
}
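
Because the MySQL DataSource above is marked @Primary, it is the single candidate that the MyBatis starter's auto-configuration will bind to. If you prefer to wire MyBatis explicitly instead, a minimal sketch could look like the following. This class is not part of the original post; it assumes mybatis-spring is on the classpath, and the mapper location is a placeholder.

package com.sangfor.api.config.druid;

import javax.sql.DataSource;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

/**
 * Sketch only: explicit MyBatis wiring against the primary MySQL data source.
 * Assumes mybatis-spring is available; the mapper location below is a placeholder.
 */
@Configuration
public class MybatisMysqlConfig {

    @Bean
    public SqlSessionFactory mysqlSqlSessionFactory(
            @Qualifier("mysqlDruidDataSource") DataSource dataSource) throws Exception {
        SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
        factoryBean.setDataSource(dataSource);
        // Placeholder path -- point this at the project's real mapper XML files.
        factoryBean.setMapperLocations(
                new PathMatchingResourcePatternResolver().getResources("classpath*:mapper/*.xml"));
        return factoryBean.getObject();
    }
}

With the MyBatis starter present, this explicit wiring is usually unnecessary, since @Primary already makes the MySQL pool the default.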

MysqlMainJdbcBaseDaoImpl:

package com.sangfor.api.config.druid;


import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

/**
 * @author: xmz
 */
@Repository
public class MysqlMainJdbcBaseDaoImpl {

    private JdbcTemplate jdbcTemplate;

    public JdbcTemplate getJdbcTemplate() {
        return jdbcTemplate;
    }

    @Autowired
    public void setJdbcTemplate(@Qualifier("mysqlDruidDataSource") DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

}
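
Injecting this wrapper gives direct access to the MySQL JdbcTemplate. A minimal usage sketch follows; the service class, package, and table name are illustrative, not from the original project.

package com.sangfor.api.service; // hypothetical package

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import com.sangfor.api.config.druid.MysqlMainJdbcBaseDaoImpl;

@Service
public class MysqlExampleService {

    @Autowired
    private MysqlMainJdbcBaseDaoImpl mysqlMainJdbcBaseDao;

    /** Counts rows in a hypothetical MySQL table through the primary data source. */
    public long countRows(String table) {
        // Identifiers cannot be bound as parameters, so the table name must be trusted/whitelisted.
        Long count = mysqlMainJdbcBaseDao.getJdbcTemplate()
                .queryForObject("SELECT COUNT(*) FROM " + table, Long.class);
        return count == null ? 0L : count;
    }
}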

PrestoDruidConfig:

package com.sangfor.api.config.druid;

/**
 * @author:  xmz
 */

import com.alibaba.druid.pool.DruidDataSource;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import javax.sql.DataSource;
import java.sql.SQLException;
import java.util.TimeZone;


@Configuration
@EnableConfigurationProperties({DataSourceProperties.class, DataSourceCommonProperties.class}) // register the @ConfigurationProperties classes as beans
public class PrestoDruidConfig {

    private static Logger logger = LoggerFactory.getLogger(PrestoDruidConfig.class);

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Autowired
    private DataSourceCommonProperties dataSourceCommonProperties;

    @Bean("prestoDruidDataSource") //新建bean实例
    @Qualifier("prestoDruidDataSource")//标识
    public DataSource dataSource(){
        TimeZone.setDefault(TimeZone.getTimeZone("+08:00"));
        DruidDataSource datasource = new DruidDataSource();

        // data-source-specific settings
        datasource.setUrl(dataSourceProperties.getPresto().get("url"));
        datasource.setUsername(dataSourceProperties.getPresto().get("username"));
//        datasource.setPassword(dataSourceProperties.getPresto().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getPresto().get("driver-class-name"));
        // shared connection-pool settings
        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
        try {
            datasource.setFilters(dataSourceCommonProperties.getFilters());
        } catch (SQLException e) {
            logger.error("Druid configuration initialization filter error.", e);
        }
        return datasource;
    }
}

PrestoJdbcBaseDaoImpl:

package com.sangfor.api.config.druid;


import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

import javax.sql.DataSource;

/**
 * @author:  xmz
 */
@Repository
public class PrestoJdbcBaseDaoImpl {

    private JdbcTemplate jdbcTemplate;

    public JdbcTemplate getJdbcTemplate() {
        return jdbcTemplate;
    }

    @Autowired
    public void setJdbcTemplate(@Qualifier("prestoDruidDataSource") DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

}
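
Presto queries work the same way through this wrapper; they run against the hive_dev catalog and aip schema fixed in the JDBC URL above. A hedged usage sketch (class, package, and table names are illustrative):

package com.sangfor.api.service; // hypothetical package

import java.util.List;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import com.sangfor.api.config.druid.PrestoJdbcBaseDaoImpl;

@Service
public class PrestoExampleService {

    @Autowired
    private PrestoJdbcBaseDaoImpl prestoJdbcBaseDao;

    /** Fetches a small sample from a table in the catalog/schema configured in the Presto URL. */
    public List<Map<String, Object>> sample(String table, int limit) {
        // Identifiers cannot be bound as parameters, so the (trusted) table name is concatenated.
        String sql = "SELECT * FROM " + table + " LIMIT " + Math.max(1, limit);
        return prestoJdbcBaseDao.getJdbcTemplate().queryForList(sql);
    }
}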

HiveDruidConfig:

package com.sangfor.api.config.druid;

/**
 * @author:  xmz
 */

import java.sql.SQLException;
import javax.sql.DataSource;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.alibaba.druid.pool.DruidDataSource;


@Configuration
@EnableConfigurationProperties({DataSourceProperties.class, DataSourceCommonProperties.class}) // register the @ConfigurationProperties classes as beans
public class HiveDruidConfig {

    private static Logger logger = LoggerFactory.getLogger(HiveDruidConfig.class);

    @Autowired
    private DataSourceProperties dataSourceProperties;

    @Autowired
    private DataSourceCommonProperties dataSourceCommonProperties;

    @Bean("hiveDruidDataSource") //新建bean实例
    @Qualifier("hiveDruidDataSource")//标识
    public DataSource dataSource(){
        DruidDataSource datasource = new DruidDataSource();

        // data-source-specific settings
        datasource.setUrl(dataSourceProperties.getHive().get("url"));
        datasource.setUsername(dataSourceProperties.getHive().get("username"));
        datasource.setPassword(dataSourceProperties.getHive().get("password"));
        datasource.setDriverClassName(dataSourceProperties.getHive().get("driver-class-name"));
        // shared connection-pool settings
        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
        try {
            datasource.setFilters(dataSourceCommonProperties.getFilters());
        } catch (SQLException e) {
            logger.error("Druid configuration initialization filter error.", e);
        }
        return datasource;
    }
}

HiveJdbcBaseDaoImpl:

package com.sangfor.api.config.druid;


import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

/**
 * @author:  xmz
 */
@Repository
public class HiveJdbcBaseDaoImpl {

    private JdbcTemplate jdbcTemplate;

    public JdbcTemplate getJdbcTemplate() {
        return jdbcTemplate;
    }

    @Autowired
    public void setJdbcTemplate(@Qualifier("hiveDruidDataSource") DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

}

The configuration is now complete and works as expected. As a quick test, create a Hive database through the injected HiveJdbcBaseDaoImpl:

public void createDB(String dbName, String dbPath) {
    String createDB = "CREATE DATABASE IF NOT EXISTS `" + dbName + "` location '" + dbPath + "'";
    try {
        hiveJdbcBaseDao.getJdbcTemplate().execute(createDB);
    } catch (DataAccessException dae) {
        throw new DataFilesException(DataFilesExceptionEnum.CREATE_HIVE_DB_FAIL, dae);
    }
}
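
Since each data source exposes its own JdbcTemplate, they can also be combined in a single service, for example reading aggregated rows from Presto and persisting them to MySQL. A sketch under those assumptions (package, class, table, and column names are placeholders):

package com.sangfor.api.service; // hypothetical package

import java.util.List;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import com.sangfor.api.config.druid.MysqlMainJdbcBaseDaoImpl;
import com.sangfor.api.config.druid.PrestoJdbcBaseDaoImpl;

@Service
public class StatsSyncService {

    @Autowired
    private PrestoJdbcBaseDaoImpl prestoJdbcBaseDao;

    @Autowired
    private MysqlMainJdbcBaseDaoImpl mysqlMainJdbcBaseDao;

    /** Copies one day's rows from a hypothetical Presto table into a hypothetical MySQL table. */
    public void syncDailyStats(String day) {
        List<Map<String, Object>> rows = prestoJdbcBaseDao.getJdbcTemplate().queryForList(
                "SELECT id, metric FROM daily_stats WHERE dt = ?", day);
        for (Map<String, Object> row : rows) {
            mysqlMainJdbcBaseDao.getJdbcTemplate().update(
                    "INSERT INTO daily_stats_copy (id, metric) VALUES (?, ?)",
                    row.get("id"), row.get("metric"));
        }
    }
}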

From: https://www.cnblogs.com/xiongmozhou/p/16787279.html
