Hadoop on Windows issue
Source: 4-8 Word Count, Running in Local Mode

bking3629688
2021-02-22
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
Following the earlier answers, I have already copied winutils.exe and hadoop.dll into HADOOP_HOME and System32, and the environment variables are configured, but I still get this error:
DEBUG [main] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
DEBUG [main] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
DEBUG [main] - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[GetGroups], valueName=Time)
DEBUG [main] - field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[Renewal failures since startup], valueName=Time)
DEBUG [main] - field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[Renewal failures since last successful login], valueName=Time)
DEBUG [main] - UgiMetrics, User and group related metrics
DEBUG [main] - Kerberos krb5 configuration not found, setting default realm to empty
DEBUG [main] - PrivilegedAction as:hadoop1 (auth:SIMPLE) from:org.apache.hadoop.fs.FileSystem.get(FileSystem.java:173)
DEBUG [main] - sampler.classes = ; loaded no samplers
DEBUG [main] - span.receiver.classes = ; loaded no span receivers
DEBUG [main] - dfs.client.use.legacy.blockreader.local = false
DEBUG [main] - dfs.client.read.shortcircuit = false
DEBUG [main] - dfs.client.domain.socket.data.traffic = false
DEBUG [main] - dfs.domain.socket.path =
DEBUG [main] - Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
DEBUG [main] - Setting hadoop.security.token.service.use_ip to true
DEBUG [main] - multipleLinearRandomRetry = null
DEBUG [main] - Creating new Groups object
DEBUG [main] - Trying to load the custom-built native-hadoop library...
DEBUG [main] - Loaded the native-hadoop library
DEBUG [main] - Using JniBasedUnixGroupsMapping for Group resolution
DEBUG [main] - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
DEBUG [main] - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
DEBUG [main] - rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@687e99d8
DEBUG [main] - getting client out of cache: org.apache.hadoop.ipc.Client@623a8092
DEBUG [main] - Both short-circuit local reads and UNIX domain socket are disabled.
DEBUG [main] - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
DEBUG [main] - hadoop login
DEBUG [main] - hadoop login commit
DEBUG [main] - using local user:NTUserPrincipal: a
DEBUG [main] - Using user: "NTUserPrincipal: a" with name a
DEBUG [main] - User entry: "a"
DEBUG [main] - UGI loginUser:a (auth:SIMPLE)
DEBUG [main] - PrivilegedAction as:a (auth:SIMPLE) from:org.apache.hadoop.mapreduce.Job.connect(Job.java:1272)
DEBUG [main] - Trying ClientProtocolProvider : org.apache.hadoop.mapred.LocalClientProtocolProvider
INFO [main] - session.id is deprecated. Instead, use dfs.metrics.session-id
INFO [main] - Initializing JVM Metrics with processName=JobTracker, sessionId=
DEBUG [main] - Picked org.apache.hadoop.mapred.LocalClientProtocolProvider as the ClientProtocolProvider
DEBUG [main] - PrivilegedAction as:a (auth:SIMPLE) from:org.apache.hadoop.mapreduce.Cluster.getFileSystem(Cluster.java:184)
DEBUG [main] - PrivilegedAction as:a (auth:SIMPLE) from:org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
DEBUG [Thread-2] - stopping client from cache: org.apache.hadoop.ipc.Client@623a8092
DEBUG [Thread-2] - removing client from cache: org.apache.hadoop.ipc.Client@623a8092
DEBUG [Thread-2] - stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@623a8092
DEBUG [Thread-2] - Stopping client
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:465)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:518)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:496)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:316)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:148)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at com.bigdata.hadoop.mr.wc.WordCountApp.main(WordCountApp.java:68)
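Before swapping binaries, it is worth a quick sanity check that the native helpers are even in the places Hadoop's Windows code looks. A minimal, stdlib-only sketch; the `C:\hadoop` fallback is an assumption, adjust it to your install:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Sanity check: are winutils.exe and hadoop.dll where Hadoop's Windows
// code expects them? The paths below are assumptions; adjust to your setup.
public class NativeLibCheck {
    public static boolean present(Path p) {
        boolean ok = Files.isRegularFile(p);
        System.out.println((ok ? "found   " : "MISSING ") + p);
        return ok;
    }

    public static void main(String[] args) {
        // Fall back to an assumed install location if HADOOP_HOME is unset.
        String home = System.getenv().getOrDefault("HADOOP_HOME", "C:\\hadoop");
        present(Paths.get(home, "bin", "winutils.exe"));
        present(Paths.get(home, "bin", "hadoop.dll"));
        present(Paths.get("C:\\Windows\\System32", "hadoop.dll"));
    }
}
```

If all three show up and the error persists, the DLL is present but built against a different Hadoop version (see the accepted answer below the original thread).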
1 Answer
bking3629688
(Original poster)
2021-02-22
Problem solved. The hadoop.dll built from vanilla Hadoop 2.6.0 apparently doesn't work with the CDH build. I replaced it with one that someone had compiled themselves and shared online, and the job ran to completion. What a relief!
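This matches the log above: "Loaded the native-hadoop library" shows hadoop.dll was found and loaded, yet the UnsatisfiedLinkError names one missing symbol, createDirectoryWithMode0, which is exactly what a DLL built from different NativeIO sources (plain Apache 2.6.0 vs. CDH) produces. The JVM binds each native method lazily, at its first call, which is why everything looks fine until job submission. A minimal stdlib-only illustration; LinkDemo and missingSymbol are made-up names:

```java
// Illustrates lazy JNI binding: a native method whose symbol is absent
// from every loaded library fails only when it is first invoked,
// mirroring a hadoop.dll that loads but lacks createDirectoryWithMode0.
public class LinkDemo {
    // No library ever registers this symbol (hypothetical name).
    public static native void missingSymbol();

    public static void main(String[] args) {
        System.out.println("class loaded fine, now calling the native method...");
        try {
            missingSymbol();
        } catch (UnsatisfiedLinkError e) {
            System.out.println("caught: " + e);
        }
    }
}
```

The class loads and runs normally right up to the call, just as the MapReduce job here got all the way to staging-directory creation before failing.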
2022-02-23