
hdu 1241(dfs)

Posted: 2023-06-28 19:31:50



Problem Description


The GeoSurvComp geologic survey company is responsible for detecting underground oil deposits. GeoSurvComp works with one large rectangular region of land at a time, and creates a grid that divides the land into numerous square plots. It then analyzes each plot separately, using sensing equipment to determine whether or not the plot contains oil. A plot containing oil is called a pocket. If two pockets are adjacent, then they are part of the same oil deposit. Oil deposits can be quite large and may contain numerous pockets. Your job is to determine how many different oil deposits are contained in a grid. 




Input

The input file contains one or more grids. Each grid begins with a line containing m and n, the number of rows and columns in the grid, separated by a single space. If m = 0 it signals the end of the input; otherwise 1 <= m <= 100 and 1 <= n <= 100. Following this are m lines of n characters each (not counting the end-of-line characters). Each character corresponds to one plot, and is either `*', representing the absence of oil, or `@', representing an oil pocket.



Output


For each grid, output the number of distinct oil deposits. Two different pockets are part of the same oil deposit if they are adjacent horizontally, vertically, or diagonally. An oil deposit will not contain more than 100 pockets.



Sample Input

1 1
*
3 5
*@*@*
**@**
*@*@*
1 8
@@****@*
5 5
****@
*@@*@
*@**@
@@@*@
@@**@
0 0


Sample Output

0
1
2
2




Problem summary: adjacent '@' cells form one region; output how many regions the grid contains. A straightforward DFS is enough to pass. One caveat: in HDU's sample data there is an extra trailing space after the "5 5" line, so delete it when testing locally.


#include <stdio.h>
#include <string.h>
int flag[3]={-1,0,1};   /* offsets -1, 0, +1; pairs of these cover all 8 directions */
char vie[105][105];     /* the grid of '*' and '@' characters */
int map[105][105];      /* visited markers: 1 once a cell has been explored */
int m,n,ans;
void DFS(int x,int y)
{
    map[x][y]=1;
    int x0,y0;
    for(int i=0;i<3;i++)        /* enumerate the 8 neighbours (the (0,0)   */
        for(int j=0;j<3;j++)    /* offset revisits (x,y), already marked)  */
        {
            x0=x+flag[i];
            y0=y+flag[j];
            if(x0<0||x0>=m||y0<0||y0>=n)   /* stay inside the grid */
                continue;
            if(vie[x0][y0]=='@'&&map[x0][y0]==0)
            {
                map[x0][y0]=1;
                DFS(x0,y0);
            }
        }
}
int main()
{
    int i,j;
    while(scanf("%d%d",&m,&n)==2&&m!=0&&n!=0)   /* "0 0" terminates the input */
    {
        memset(vie,0,sizeof(vie));
        memset(map,0,sizeof(map));
        ans=0;
        for(i=0;i<m;i++)
            scanf("%s",vie[i]);
        for(i=0;i<m;i++)        /* every unvisited '@' starts a new deposit */
            for(j=0;j<n;j++)
                if(vie[i][j]=='@'&&map[i][j]==0)
                {
                    DFS(i,j);
                    ans++;
                }
        printf("%d\n",ans);
    }
    return 0;
}
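A side note: the recursive DFS above can nest up to m*n calls deep (10,000 frames on a 100x100 grid that is entirely '@'), which is fine here but can overflow the stack on larger grids. A minimal sketch of the same flood fill with an explicit stack instead of recursion (the function name `count_deposits` and the `MAXN` constant are my own, not from the original):

```c
#include <string.h>

#define MAXN 105

/* Count 8-connected groups of '@' in an m x n grid, iteratively. */
static int count_deposits(int m, int n, char grid[MAXN][MAXN])
{
    static int seen[MAXN][MAXN];
    static int stack[MAXN * MAXN][2];   /* explicit DFS stack of (x, y) pairs */
    int ans = 0;
    memset(seen, 0, sizeof(seen));

    for (int i = 0; i < m; i++)
        for (int j = 0; j < n; j++) {
            if (grid[i][j] != '@' || seen[i][j])
                continue;
            ans++;                      /* new deposit: flood fill from (i, j) */
            int top = 0;
            stack[top][0] = i; stack[top][1] = j; top++;
            seen[i][j] = 1;
            while (top > 0) {
                top--;
                int x = stack[top][0], y = stack[top][1];
                for (int dx = -1; dx <= 1; dx++)      /* all 8 neighbours */
                    for (int dy = -1; dy <= 1; dy++) {
                        int x0 = x + dx, y0 = y + dy;
                        if (x0 < 0 || x0 >= m || y0 < 0 || y0 >= n)
                            continue;
                        if (grid[x0][y0] == '@' && !seen[x0][y0]) {
                            seen[x0][y0] = 1;   /* mark on push: each cell enters once */
                            stack[top][0] = x0; stack[top][1] = y0; top++;
                        }
                    }
            }
        }
    return ans;
}
```

Marking cells as seen when they are pushed (rather than when they are popped) keeps each cell on the stack at most once, so the MAXN*MAXN stack can never overflow.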





From: https://blog.51cto.com/u_10729926/6575716
