
12.17

Posted: 2025-01-14

1. Objectives

(1) Understand the role HBase plays in the Hadoop ecosystem;

(2) Become proficient with the common HBase shell commands;

(3) Become familiar with the common HBase Java API.

2. Platform

(1) OS: Linux (Ubuntu 16.04 or Ubuntu 18.04 recommended);

(2) Hadoop version: 3.1.3;

(3) HBase version: 2.2.2;

(4) JDK version: 1.8;

(5) Java IDE: Eclipse.

3. Procedure

(I) Implement each of the following functions in a program, then complete the same tasks with the HBase shell commands:

(1) List all HBase tables and related information, such as the table name;

(2) Print every record of a specified table to the terminal;

(3) Add and delete a specified column family or column in an existing table;

(4) Clear all records of a specified table;

(5) Count the rows of a table.
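For the shell half of these five tasks, the commands are roughly as follows. This is a sketch: 'person', 'row1', and 'newFamily' are placeholder names, and since HBase schemas only contain column families, "deleting a column" means either removing a family with alter or removing cell values with delete.

```
hbase> list                                                     # (1) list all tables
hbase> scan 'person'                                            # (2) print every record
hbase> alter 'person', NAME => 'newFamily'                      # (3) add a column family
hbase> alter 'person', NAME => 'newFamily', METHOD => 'delete'  # (3) delete a column family
hbase> delete 'person', 'row1', 'Score:Math'                    # (3) delete one cell value
hbase> truncate 'person'                                        # (4) clear all records (disables, drops, recreates)
hbase> count 'person'                                           # (5) count the rows
```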

 

 

(II) HBase database operations

1. Given the relational tables and data below (Tables 14-3 to 14-5), convert them into tables suitable for HBase storage and insert the data:

Table 14-3 Student

Student ID (S_No)   Name (S_Name)   Sex (S_Sex)   Age (S_Age)
2015001             Zhangsan        male          23
2015002             Mary            female        22
2015003             Lisi            male          24

Table 14-4 Course

Course ID (C_No)   Course name (C_Name)   Credits (C_Credit)
123001             Math                   2.0
123002             Computer Science       5.0
123003             English                3.0

Table 14-5 SC

Student ID (SC_Sno)   Course ID (SC_Cno)   Score (SC_Score)
2015001               123001               86
2015001               123003               69
2015002               123002               77
2015002               123003               99
2015003               123001               98
2015003               123002               95
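One possible HBase mapping for the three relations, sketched in the HBase shell. The design here is an assumption, not the only valid one: each primary key becomes the row key, all attributes live under a single 'info' column family, and SC uses a compound "S_No_C_No" row key. Only the first row of each table is shown; the rest follow the same pattern.

```
hbase> create 'Student', 'info'
hbase> put 'Student', '2015001', 'info:S_Name', 'Zhangsan'
hbase> put 'Student', '2015001', 'info:S_Sex',  'male'
hbase> put 'Student', '2015001', 'info:S_Age',  '23'

hbase> create 'Course', 'info'
hbase> put 'Course', '123001', 'info:C_Name',   'Math'
hbase> put 'Course', '123001', 'info:C_Credit', '2.0'

hbase> create 'SC', 'info'
hbase> put 'SC', '2015001_123001', 'info:SC_Score', '86'
```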

 

2. Implement the following functions:

(1) createTable(String tableName, String[] fields)

Create a table. The parameter tableName is the table name; the string array fields holds the field names of a record, each of which becomes a column family here. If a table named tableName already exists in HBase, delete it first and then create the new one.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

import java.io.IOException;

public class CreateTable {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void createTable(String tableName, String[] fields) throws IOException {
        init();
        TableName tablename = TableName.valueOf(tableName);
        // If the table already exists, disable and delete it before recreating it
        if (admin.tableExists(tablename)) {
            System.out.println("table exists, deleting it first!");
            admin.disableTable(tablename);
            admin.deleteTable(tablename);
        }
        // HTableDescriptor/HColumnDescriptor are deprecated in HBase 2.x but still work;
        // each element of fields becomes one column family
        HTableDescriptor hTableDescriptor = new HTableDescriptor(tablename);
        for (String str : fields) {
            hTableDescriptor.addFamily(new HColumnDescriptor(str));
        }
        admin.createTable(hTableDescriptor);
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://node1:8020/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        String[] fields = {"Score"};
        try {
            createTable("person", fields);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

 

(2) addRecord(String tableName, String row, String[] fields, String[] values)

Add the data in values to the cells of table tableName identified by row row (expressed with S_Name) and the string array fields. If an element of fields refers to a column qualifier under a column family, write it as "columnFamily:column". For example, to add scores to the "Math", "Computer Science" and "English" columns at once, fields is {"Score:Math", "Score:Computer Science", "Score:English"} and values holds the three scores.
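The "columnFamily:column" convention can be illustrated in isolation before the full client class. This sketch (plain Java, no HBase dependency; FieldSplitDemo and splitField are names made up for the demo) shows how each fields entry splits into the family and qualifier that Put.addColumn expects; a split limit of 2 keeps any further ':' characters inside the qualifier.

```java
public class FieldSplitDemo {
    // Split a "family:qualifier" spec into its two parts.
    static String[] splitField(String field) {
        return field.split(":", 2);
    }

    public static void main(String[] args) {
        String[] fields = {"Score:Math", "Score:Computer Science", "Score:English"};
        for (String f : fields) {
            String[] parts = splitField(f);
            // Family on the left of ':', qualifier on the right
            System.out.println(parts[0] + " -> " + parts[1]);
        }
    }
}
```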

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;

import java.io.IOException;

public class AddRecord {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void addRecord(String tableName, String row, String[] fields, String[] values) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        for (int i = 0; i < fields.length; i++) {
            Put put = new Put(row.getBytes());
            // fields[i] is "columnFamily:column"; a bare family name gets an empty qualifier
            String[] cols = fields[i].split(":", 2);
            String qualifier = cols.length > 1 ? cols[1] : "";
            put.addColumn(cols[0].getBytes(), qualifier.getBytes(), values[i].getBytes());
            table.put(put);
        }
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://node1:8020/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        String[] fields = {"Score:Math", "Score:Computer Science", "Score:English"};
        String[] values = {"99", "80", "100"};
        try {
            addRecord("person", "Score", fields, values);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

 

 

 

(3) scanColumn(String tableName, String column)

Browse the data of one column of table tableName; if a row has no data in that column, return null. When the parameter column is a column family name with several column qualifiers under it, list the data of every qualifier; when column names a specific column (e.g. "Score:Math"), list only that column's data.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.io.IOException;

public class ScanColumn {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void scanColumn(String tableName, String column) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        // "family:qualifier" scans a single column; a bare name scans the whole family
        if (column.contains(":")) {
            String[] cols = column.split(":", 2);
            scan.addColumn(Bytes.toBytes(cols[0]), Bytes.toBytes(cols[1]));
        } else {
            scan.addFamily(Bytes.toBytes(column));
        }
        ResultScanner scanner = table.getScanner(scan);
        for (Result result = scanner.next(); result != null; result = scanner.next()) {
            showCell(result);
        }
        scanner.close();
        table.close();
        close();
    }

    public static void showCell(Result result) {
        for (Cell cell : result.rawCells()) {
            System.out.println("RowName: " + new String(CellUtil.cloneRow(cell)));
            System.out.println("Timestamp: " + cell.getTimestamp());
            System.out.println("ColumnFamily: " + new String(CellUtil.cloneFamily(cell)));
            System.out.println("ColumnName: " + new String(CellUtil.cloneQualifier(cell)));
            System.out.println("Value: " + new String(CellUtil.cloneValue(cell)));
        }
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://node1:8020/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Close the connection
    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            scanColumn("person", "Score");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

 

(4) modifyData(String tableName, String row, String column, String val)

Modify the data in the cell of table tableName identified by row row (the student name S_Name can be used) and column column, setting it to val.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;

import java.io.IOException;

public class ModifyData {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void modifyData(String tableName, String row, String column, String val) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Put put = new Put(row.getBytes());
        // column is written as "columnFamily:qualifier"; a bare family gets an empty qualifier.
        // A plain put with a fresh timestamp is enough to "modify" the cell:
        // HBase returns the newest version on read.
        String[] cols = column.split(":", 2);
        String qualifier = cols.length > 1 ? cols[1] : "";
        put.addColumn(cols[0].getBytes(), qualifier.getBytes(), val.getBytes());
        table.put(put);
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://node1:8020/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            modifyData("person", "Score", "Score:Math", "100");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

 

 

(5) deleteRow(String tableName, String row)

Delete the record of the row specified by row in table tableName.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;

import java.io.IOException;

public class DeleteRow {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void deleteRow(String tableName, String row) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        // A Delete with no family or qualifier removes every cell in the row
        Delete delete = new Delete(row.getBytes());
        table.delete(delete);
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://node1:8020/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            deleteRow("person", "Score");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

From: https://www.cnblogs.com/szm123/p/18671475
