
Compiling Hadoop


 

First, set up the Java environment in /etc/profile:

sudo vi /etc/profile

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-mips64el
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$JAVA_HOME/bin:$PATH

source /etc/profile
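
A quick sanity check that the mips64el OpenJDK is actually picked up (the expected output is only an example, not copied from this machine):

echo $JAVA_HOME
java -version
javac -version
mvn -version     # Maven should report the same JDK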

 

Build Hadoop. The dist and native profiles produce the binary distribution and the native libraries, and -DskipTests skips the unit tests:

mvn clean package -DskipTests -Pdist,native

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  07:52 min
[INFO] Finished at: 2022-12-20T18:05:41+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.8.5:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]                                                                                                                                                                                                   
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
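
The compile-protoc goal simply shells out to protoc --version and expects a ProtocolBuffer 2.5.0 binary on the PATH, so this failure just means protoc is not installed (or not visible to Maven) yet. A quick check before retrying the build:

which protoc || echo "protoc not found on PATH"
protoc --version     # hadoop-2.8.5 expects: libprotoc 2.5.0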

  

uos@uos-PC:~/wss/hadoop-rel-release-2.8.5$ cat BUILDING.txt

Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
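
On this UOS (Debian-style) system, most of these requirements can be installed in one go. The package names below are my assumption for a Debian-based distro, not taken from BUILDING.txt:

sudo apt-get install build-essential autoconf automake libtool \
                     maven cmake zlib1g-dev libssl-dev libfuse-dev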

  

 

BUILDING.txt requires ProtocolBuffer 2.5.0, so the next step is building protoc 2.5.0 from source. The protobuf-2.5.0 source tree only ships configure.ac, not a generated configure script; running autoconf to generate one fails with the following errors:

uos@uos-PC:~/wss/protobuf/protobuf-2.5.0$ autoconf
configure.ac:17: error: possibly undefined macro: AM_MAINTAINER_MODE
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
configure.ac:32: error: possibly undefined macro: AM_INIT_AUTOMAKE
configure.ac:49: error: possibly undefined macro: AM_CONDITIONAL
configure.ac:75: error: possibly undefined macro: AC_PROG_LIBTOOL
configure.ac:140: error: possibly undefined macro: AC_CXX_STL_HASH

These AM_* and AC_PROG_LIBTOOL macros come from automake and libtool, which plain autoconf does not run on its own. Running autoreconf --install instead regenerates everything and resolves the errors above.
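
For reference, autoreconf --install is roughly equivalent to running the individual autotools by hand (a sketch; exact flags vary by autotools version):

libtoolize --copy --force       # supplies the support files behind AC_PROG_LIBTOOL
aclocal                         # collects the AM_* macros into aclocal.m4
autoheader                      # generates config.h.in
automake --add-missing --copy   # generates the Makefile.in files
autoconf                        # finally generates ./configure

The automake step also prints the subdir-objects warnings below; they are forward-compatibility warnings only and do not affect this build: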

src/Makefile.am:314: warning: source file 'google/protobuf/compiler/cpp/cpp_plugin_unittest.cc' is in a subdirectory,
src/Makefile.am:314: but option 'subdir-objects' is disabled
src/Makefile.am:314: warning: source file 'google/protobuf/compiler/java/java_plugin_unittest.cc' is in a subdirectory,
src/Makefile.am:314: but option 'subdir-objects' is disabled
src/Makefile.am:314: warning: source file 'google/protobuf/compiler/java/java_doc_comment_unittest.cc' is in a subdirectory,
src/Makefile.am:314: but option 'subdir-objects' is disabled
src/Makefile.am:314: warning: source file 'google/protobuf/compiler/python/python_plugin_unittest.cc' is in a subdirectory,
src/Makefile.am:314: but option 'subdir-objects' is disabled
src/Makefile.am:206: warning: source file 'google/protobuf/compiler/main.cc' is in a subdirectory,
src/Makefile.am:206: but option 'subdir-objects' is disabled
src/Makefile.am:380: warning: source file 'google/protobuf/compiler/mock_code_generator.cc' is in a subdirectory,
src/Makefile.am:380: but option 'subdir-objects' is disabled
src/Makefile.am:380: warning: source file 'google/protobuf/testing/file.cc' is in a subdirectory,
src/Makefile.am:380: but option 'subdir-objects' is disabled
src/Makefile.am:380: warning: source file 'google/protobuf/compiler/test_plugin.cc' is in a subdirectory,
src/Makefile.am:380: but option 'subdir-objects' is disabled
src/Makefile.am:391: warning: source file 'google/protobuf/testing/zcgunzip.cc' is in a subdirectory,
src/Makefile.am:391: but option 'subdir-objects' is disabled
src/Makefile.am:388: warning: source file 'google/protobuf/testing/zcgzip.cc' is in a subdirectory,
src/Makefile.am:388: but option 'subdir-objects' is disabled
src/Makefile.am: installing './depcomp'
parallel-tests: installing './test-driver'

./configure

checking for RCC... no
checking for xlC_r... no
checking for xlC... no
checking whether we are using the GNU C++ compiler... no
checking whether g++ accepts -g... no
checking dependency style of g++... none
checking how to run the C++ preprocessor... /lib/cpp
configure: error: in `/home/uos/wss/protobuf/protobuf-2.5.0':
configure: error: C++ preprocessor "/lib/cpp" fails sanity check
See `config.log' for more details

The sanity-check failure means there is no working C++ toolchain installed yet. Install it and rerun configure:

sudo apt-get install build-essential
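
After installing the toolchain, a quick check before rerunning configure (version numbers will differ per system):

g++ --version
cpp --version
make --version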

./configure

checking for the pthreads library -llthread... no
checking whether pthreads work with -pthread... yes
checking for joinable pthread attribute... PTHREAD_CREATE_JOINABLE
checking if more special flags are required for pthreads... no
checking whether to check for GCC pthread/shared inconsistencies... yes
checking whether -pthread is sufficient with -shared... yes
checking whether what we have so far is sufficient with -nostdlib... no
checking whether -lpthread saves the day... yes
checking the location of hash_map... <unordered_map>
checking that generated files are newer than configure... done
configure: creating ./config.status
config.status: creating Makefile
config.status: creating src/Makefile
config.status: creating protobuf.pc
config.status: creating protobuf-lite.pc
config.status: creating config.h
config.status: executing depfiles commands
config.status: executing libtool commands

  

make

Making all in src
make[2]: Entering directory '/home/uos/wss/protobuf/protobuf-2.5.0/src'
/bin/bash ../libtool  --tag=CXX   --mode=compile g++ -DHAVE_CONFIG_H -I. -I..    -pthread -Wall -Wwrite-strings -Woverloaded-virtual -Wno-sign-compare -O2 -g -DNDEBUG -MT common.lo -MD -MP -MF .deps/common.Tpo -c -o common.lo `test -f 'google/protobuf/stubs/common.cc' || echo './'`google/protobuf/stubs/common.cc
libtool: compile:  g++ -DHAVE_CONFIG_H -I. -I.. -pthread -Wall -Wwrite-strings -Woverloaded-virtual -Wno-sign-compare -O2 -g -DNDEBUG -MT common.lo -MD -MP -MF .deps/common.Tpo -c google/protobuf/stubs/common.cc  -fPIC -DPIC -o .libs/common.o
In file included from google/protobuf/stubs/common.cc:34:
./google/protobuf/stubs/once.h: In function ‘void google::protobuf::GoogleOnceInit(google::protobuf::ProtobufOnceType*, void (*)())’:
./google/protobuf/stubs/once.h:125:30: error: cannot convert ‘google::protobuf::ProtobufOnceType*’ {aka ‘long int*’} to ‘const volatile Atomic32*’ {aka ‘const volatile int*’}
   if (internal::Acquire_Load(once) != ONCE_STATE_DONE) {
                              ^~~~
In file included from ./google/protobuf/stubs/atomicops.h:184,
                 from ./google/protobuf/stubs/once.h:81,
                 from google/protobuf/stubs/common.cc:34:
./google/protobuf/stubs/atomicops_internals_mips_gcc.h:170:55: note:   initializing argument 1 of ‘google::protobuf::internal::Atomic32 google::protobuf::internal::Acquire_Load(const volatile Atomic32*)’
 inline Atomic32 Acquire_Load(volatile const Atomic32* ptr) {
                              ~~~~~~~~~~~~~~~~~~~~~~~~~^~~
In file included from google/protobuf/stubs/common.cc:34:
./google/protobuf/stubs/once.h: In function ‘void google::protobuf::GoogleOnceInit(google::protobuf::ProtobufOnceType*, void (*)(Arg*), Arg*)’:
./google/protobuf/stubs/once.h:134:30: error: cannot convert ‘google::protobuf::ProtobufOnceType*’ {aka ‘long int*’} to ‘const volatile Atomic32*’ {aka ‘const volatile int*’}
   if (internal::Acquire_Load(once) != ONCE_STATE_DONE) {
                              ^~~~
In file included from ./google/protobuf/stubs/atomicops.h:184,
                 from ./google/protobuf/stubs/once.h:81,
                 from google/protobuf/stubs/common.cc:34:
./google/protobuf/stubs/atomicops_internals_mips_gcc.h:170:55: note:   initializing argument 1 of ‘google::protobuf::internal::Atomic32 google::protobuf::internal::Acquire_Load(const volatile Atomic32*)’
 inline Atomic32 Acquire_Load(volatile const Atomic32* ptr) {
                              ~~~~~~~~~~~~~~~~~~~~~~~~~^~~
make[2]: *** [Makefile:1619:common.lo] Error 1
make[2]: Leaving directory '/home/uos/wss/protobuf/protobuf-2.5.0/src'
make[1]: *** [Makefile:669:all-recursive] Error 1
make[1]: Leaving directory '/home/uos/wss/protobuf/protobuf-2.5.0'
make: *** [Makefile:576:all] Error 2
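
The failure is inside protobuf's own stubs: on this platform ProtobufOnceType is a 64-bit AtomicWord (long int), but atomicops_internals_mips_gcc.h in protobuf 2.5.0 only provides 32-bit Atomic32 operations, so Acquire_Load has no matching overload; the MIPS atomics in 2.5.0 appear to be 32-bit only. A quick way to confirm the toolchain really is 64-bit MIPS (generic diagnostic commands, not a fix):

uname -m                                    # e.g. mips64
g++ -dumpmachine                            # e.g. mips64el-linux-gnuabi64
echo | g++ -dM -E - | grep -E '__mips64|_MIPS_SIM'

Getting past this typically means patching 64-bit MIPS atomic operations into protobuf 2.5.0, or building it for the 32-bit ABI; that is the usual direction, offered here as an assumption rather than a verified fix.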

  

 

From: https://www.cnblogs.com/linuxws/p/16994950.html
