Posted: 2025-01-06

Lab 3

Getting Familiar with Common HBase Operations

 

 

1. Objectives

(1) Understand the role HBase plays in the Hadoop architecture;

(2) Use the common HBase Shell commands proficiently;

(3) Become familiar with the common HBase Java APIs.

2. Platform

(1) Operating system: Linux (Ubuntu 16.04 or Ubuntu 18.04 recommended);

(2) Hadoop version: 3.1.3;

(3) HBase version: 2.2.2;

(4) JDK version: 1.8;

(5) Java IDE: Eclipse.

3. Procedure

(I) Implement each of the following tasks programmatically, and also complete the same task with HBase Shell commands:

(1) List all HBase tables and their basic information, such as table names;
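In the HBase shell, `list` prints the names of all user tables, and `describe` shows one table's details (the Student table below is assumed to exist):

list

describe 'Student'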

 

(2) Print all records of a specified table to the terminal;
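In the HBase shell, `scan` with only a table name prints every record of that table (Student as an example):

scan 'Student'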

 

(3) Add and delete a specified column family or column in an existing table;

Add:

put 'Student','s000','S_Name','Huangkaiyu'

Delete:

delete 'Student','s000','S_Name'
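The put/delete pair above adds and removes a single cell. Adding or removing an entire column family is done with `alter`; the family name S_Birth below is only an illustrative example:

alter 'Student', NAME => 'S_Birth'

alter 'Student', NAME => 'S_Birth', METHOD => 'delete'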

 

(4) Clear all records of a specified table;

truncate 'Student'

(5) Count the number of rows in a table.

count 'Student'
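Note that `count` scans the entire table from the shell, which can be slow for large tables; HBase also ships a MapReduce-based counter that can be run from the system shell instead:

hbase org.apache.hadoop.hbase.mapreduce.RowCounter 'Student'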

 

 

(II) HBase database operations

1. Given the following relational tables and data (Tables 14-3 to 14-5), convert them into tables suitable for HBase storage and insert the data:

Table 14-3 Student table (Student)

Student No. (S_No) | Name (S_Name) | Sex (S_Sex) | Age (S_Age)
2015001 | Zhangsan | male | 23
2015002 | Mary | female | 22
2015003 | Lisi | male | 24

create 'Student','S_No','S_Name','S_Sex','S_Age'

put 'Student','s001','S_No','2015001'

put 'Student','s001','S_Name','Zhangsan'

put 'Student','s001','S_Sex','male'

put 'Student','s001','S_Age','23'

put 'Student','s002','S_No','2015002'

put 'Student','s002','S_Name','Mary'

put 'Student','s002','S_Sex','female'

put 'Student','s002','S_Age','22'

put 'Student','s003','S_No','2015003'

put 'Student','s003','S_Name','Lisi'

put 'Student','s003','S_Sex','male'

put 'Student','s003','S_Age','24'
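A design note on the commands above: every field becomes its own column family, and the row keys (s001 and so on) duplicate the information in S_No. An alternative sketch (S_Info is a made-up family name) uses the student number itself as the row key and groups the fields under a single family:

create 'Student','S_Info'

put 'Student','2015001','S_Info:S_Name','Zhangsan'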

 

 

Table 14-4 Course table (Course)

Course No. (C_No) | Course Name (C_Name) | Credits (C_Credit)
123001 | Math | 2.0
123002 | Computer Science | 5.0
123003 | English | 3.0

create 'Course','C_No','C_Name','C_Credit'

put 'Course','c001','C_No','123001'

put 'Course','c001','C_Name','Math'

put 'Course','c001','C_Credit','2.0'

put 'Course','c002','C_No','123002'

put 'Course','c002','C_Name','Computer Science'

put 'Course','c002','C_Credit','5.0'

put 'Course','c003','C_No','123003'

put 'Course','c003','C_Name','English'

put 'Course','c003','C_Credit','3.0'

 

 

 

Table 14-5 SC table (SC)

Student No. (SC_Sno) | Course No. (SC_Cno) | Score (SC_Score)
2015001 | 123001 | 86
2015001 | 123003 | 69
2015002 | 123002 | 77
2015002 | 123003 | 99
2015003 | 123001 | 98
2015003 | 123002 | 95

create 'SC','SC_Sno','SC_Cno','SC_Score'

put 'SC','sc001','SC_Sno','2015001'

put 'SC','sc001','SC_Cno','123001'

put 'SC','sc001','SC_Score','86'

put 'SC','sc002','SC_Sno','2015001'

put 'SC','sc002','SC_Cno','123003'

put 'SC','sc002','SC_Score','69'

put 'SC','sc003','SC_Sno','2015002'

put 'SC','sc003','SC_Cno','123002'

put 'SC','sc003','SC_Score','77'

put 'SC','sc004','SC_Sno','2015002'

put 'SC','sc004','SC_Cno','123003'

put 'SC','sc004','SC_Score','99'

put 'SC','sc005','SC_Sno','2015003'

put 'SC','sc005','SC_Cno','123001'

put 'SC','sc005','SC_Score','98'

put 'SC','sc006','SC_Sno','2015003'

put 'SC','sc006','SC_Cno','123002'

put 'SC','sc006','SC_Score','95'

 

 

 

2. Implement the following functions in Java:

(1) createTable(String tableName, String[] fields)

Creates a table. The parameter tableName is the table name, and the string array fields holds the names of the record's fields (one column family per field). If a table named tableName already exists, delete it first and then create the new one.

 

import org.apache.hadoop.conf.Configuration;

import org.apache.hadoop.hbase.HBaseConfiguration;

import org.apache.hadoop.hbase.HColumnDescriptor;

import org.apache.hadoop.hbase.HTableDescriptor;

import org.apache.hadoop.hbase.TableName;

import org.apache.hadoop.hbase.client.Admin;

import org.apache.hadoop.hbase.client.Connection;

import org.apache.hadoop.hbase.client.ConnectionFactory;

 

import java.io.IOException;

 

public class CreateTable {

    public static Configuration configuration;

    public static Connection connection;

    public static Admin admin;

 

    public static void createTable(String tableName, String[] fields) throws IOException {

        init();

        TableName tablename = TableName.valueOf(tableName);

        if (admin.tableExists(tablename)) {

            System.out.println("Table already exists, deleting it first!");

            admin.disableTable(tablename);

            admin.deleteTable(tablename);

        }

        HTableDescriptor hTableDescriptor = new HTableDescriptor(tablename);

        for (String str : fields) {

            HColumnDescriptor hColumnDescriptor = new HColumnDescriptor(str);

            hTableDescriptor.addFamily(hColumnDescriptor);

        }

        admin.createTable(hTableDescriptor);

        close();

    }

 

    public static void init() {

        configuration = HBaseConfiguration.create();

        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");

        try {

            connection = ConnectionFactory.createConnection(configuration);

            admin = connection.getAdmin();

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

 

    public static void close() {

        try {

            if (admin != null) {

                admin.close();

            }

            if (null != connection) {

                connection.close();

            }

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

 

    public static void main(String[] args) {

        String[] fields = {"Score"};

        try {

            createTable("person", fields);

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

}

 

 

(2) addRecord(String tableName, String row, String[] fields, String[] values)

Adds the data in values to the cells of table tableName identified by row (the student name S_Name can be used) and the string array fields. If an element of fields sits under a column family that has column qualifiers, write it as "columnFamily:column". For example, to add scores to the "Math", "Computer Science", and "English" columns at once, fields would be {"Score:Math", "Score:Computer Science", "Score:English"} and values would hold the three scores.

 

import org.apache.hadoop.conf.Configuration;

import org.apache.hadoop.hbase.HBaseConfiguration;

import org.apache.hadoop.hbase.TableName;

import org.apache.hadoop.hbase.client.*;

 

import java.io.IOException;

 

public class AddRecord {

    public static Configuration configuration;

    public static Connection connection;

    public static Admin admin;

 

    public static void addRecord(String tableName, String row, String[] fields, String[] values) throws IOException {

        init();

        Table table = connection.getTable(TableName.valueOf(tableName));

        for (int i = 0; i != fields.length; i++) {

            Put put = new Put(row.getBytes());

            String[] cols = fields[i].split(":");

            put.addColumn(cols[0].getBytes(), cols[1].getBytes(), values[i].getBytes());

            table.put(put);

        }

        table.close();

        close();

    }

 

    public static void init() {

        configuration = HBaseConfiguration.create();

        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");

        try {

            connection = ConnectionFactory.createConnection(configuration);

            admin = connection.getAdmin();

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

 

    public static void close() {

        try {

            if (admin != null) {

                admin.close();

            }

            if (null != connection) {

                connection.close();

            }

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

 

    public static void main(String[] args) {

        String[] fields = {"Score:Math", "Score:Computer Science", "Score:English"};

        String[] values = {"99", "80", "100"};

        try {

            addRecord("person", "Score", fields, values);

        } catch (IOException e) {

            e.printStackTrace();

        }

 

    }

}

 

 

 

(3) scanColumn(String tableName, String column)

Browses the data of one column of table tableName, returning null for rows that have no data in that column. When column is the name of a column family with several column qualifiers beneath it, list the data of every such column; when column names one specific column (e.g. "Score:Math"), list only that column's data.

 

import org.apache.hadoop.conf.Configuration;

import org.apache.hadoop.hbase.Cell;

import org.apache.hadoop.hbase.CellUtil;

import org.apache.hadoop.hbase.HBaseConfiguration;

import org.apache.hadoop.hbase.TableName;

import org.apache.hadoop.hbase.client.*;

import org.apache.hadoop.hbase.util.Bytes;

 

import java.io.IOException;

 

public class ScanColumn {

    public static Configuration configuration;

    public static Connection connection;

    public static Admin admin;

 

    public static void scanColumn(String tableName, String column) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        // column may be a bare family name ("Score") or "family:qualifier" ("Score:Math")
        if (column.contains(":")) {
            String[] cols = column.split(":");
            scan.addColumn(Bytes.toBytes(cols[0]), Bytes.toBytes(cols[1]));
        } else {
            scan.addFamily(Bytes.toBytes(column));
        }

        ResultScanner scanner = table.getScanner(scan);

        for (Result result = scanner.next(); result != null; result = scanner.next()) {

            showCell(result);

        }

        table.close();

        close();

    }

 

    public static void showCell(Result result) {

        Cell[] cells = result.rawCells();

        for (Cell cell : cells) {

            System.out.println("Row Name: " + new String(CellUtil.cloneRow(cell)));
            System.out.println("Timestamp: " + cell.getTimestamp());
            System.out.println("Column Family: " + new String(CellUtil.cloneFamily(cell)));
            System.out.println("Column Name: " + new String(CellUtil.cloneQualifier(cell)));
            System.out.println("Value: " + new String(CellUtil.cloneValue(cell)));

        }

    }

 

    public static void init() {

        configuration = HBaseConfiguration.create();

        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");

        try {

            connection = ConnectionFactory.createConnection(configuration);

            admin = connection.getAdmin();

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

 

    // Close the connection

    public static void close() {

        try {

            if (admin != null) {

                admin.close();

            }

            if (null != connection) {

                connection.close();

            }

        } catch (IOException e) {

            e.printStackTrace();

        }

    }

 

    public static void main(String[] args) {

        try {

            scanColumn("person", "Score");

        } catch (IOException e) {

            e.printStackTrace();

        }

 

    }

}

 

 

 

(4) modifyData(String tableName, String row, String column)

Modifies the data in the cell of table tableName identified by row (the student name S_Name can be used) and column.

 

import org.apache.hadoop.conf.Configuration;

import org.apache.hadoop.hbase.Cell;

import org.apache.hadoop.hbase.HBaseConfiguration;

import org.apache.hadoop.hbase.TableName;

import org.apache.hadoop.hbase.client.*;

 

import java.io.IOException;

 

public class ModifyData {

 

    public static long ts;

    public static Configuration configuration;

    public static Connection connection;

    public static Admin admin;

 

    // column follows the same "Family:Qualifier" convention as addRecord above;
    // a bare family name is treated as an empty qualifier
    public static void modifyData(String tableName, String row, String column, String val) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        String[] cols = column.contains(":") ? column.split(":") : new String[]{column, ""};
        byte[] family = cols[0].getBytes();
        byte[] qualifier = cols[1].getBytes();
        // Read the existing cell first so the new value can reuse its timestamp
        // and overwrite the old version in place
        Get get = new Get(row.getBytes());
        get.addColumn(family, qualifier);
        Result result = table.get(get);
        Put put = new Put(row.getBytes());
        if (result.isEmpty()) {
            put.addColumn(family, qualifier, val.getBytes());
        } else {
            ts = result.rawCells()[0].getTimestamp();
            put.addColumn(family, qualifier, ts, val.getBytes());
        }
        table.put(put);
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            modifyData("person", "Score", "Score:Math", "100");

        } catch (IOException e) {

            e.printStackTrace();

        }

 

    }

}

 

 

 

(5) deleteRow(String tableName, String row)

Deletes the record identified by row from table tableName.

 

import org.apache.hadoop.conf.Configuration;

import org.apache.hadoop.hbase.Cell;

import org.apache.hadoop.hbase.HBaseConfiguration;

import org.apache.hadoop.hbase.TableName;

import org.apache.hadoop.hbase.client.*;

 

import java.io.IOException;

 

public class DeleteRow {

    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void deleteRow(String tableName, String row) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        // A Delete built from only the row key removes every cell in that row
        Delete delete = new Delete(row.getBytes());
        table.delete(delete);
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) {
                admin.close();
            }
            if (connection != null) {
                connection.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            deleteRow("person", "Score");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
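deleteRow's effect can also be obtained directly in the HBase shell with `deleteall`; the table and row key below match the "person" example used throughout this lab:

deleteall 'person','Score'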

From: https://www.cnblogs.com/wanbeibei/p/18656530
