Using Flink with Paimon in Dinky fails with NoSuchMethodError: 'org.apache.htrace.core.Tracer — how to fix it?
When using Flink with Paimon in Dinky, the job fails with the following error:
Caused by: java.lang.NoSuchMethodError: 'org.apache.htrace.core.Tracer org.apache.hadoop.fs.FsTracer.get(org.apache.hadoop.conf.Configuration)'
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:307) ~[flink-shaded-hadoop-3-3.1.1.7.2.9.0-173-9.0.jar:3.1.1.7.2.9.0-173-9.0]
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:292) ~[flink-shaded-hadoop-3-3.1.1.7.2.9.0-173-9.0.jar:3.1.1.7.2.9.0-173-9.0]
at org.apache.hadoop.hdfs.DistributedFileSystem.initDFSClient(DistributedFileSystem.java:200) ~[flink-shaded-hadoop-3-3.1.1.7.2.9.0-173-9.0.jar:3.1.1.7.2.9.0-173-9.0]
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:185) ~[flink-shaded-hadoop-3-3.1.1.7.2.9.0-173-9.0.jar:3.1.1.7.2.9.0-173-9.0]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3469) ~[flink-azure-fs-hadoop-1.20.0.jar:1.20.0]
at org.apache.hadoop.fs.FileSystem.access$000(FileSystem.java:174) ~[flink-azure-fs-hadoop-1.20.0.jar:1.20.0]
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3574) ~[flink-azure-fs-hadoop-1.20.0.jar:1.20.0]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3521) ~[flink-azure-fs-hadoop-1.20.0.jar:1.20.0]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:540) ~[flink-azure-fs-hadoop-1.20.0.jar:1.20.0]
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365) ~[flink-azure-fs-hadoop-1.20.0.jar:1.20.0]
at org.apache.paimon.fs.hadoop.HadoopFileIO.createFileSystem(HadoopFileIO.java:175) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:168) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:145) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.fs.hadoop.HadoopFileIO.exists(HadoopFileIO.java:110) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.fs.FileIO.checkOrMkdirs(FileIO.java:208) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.catalog.CatalogFactory.createUnwrappedCatalog(CatalogFactory.java:98) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:71) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:63) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:53) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.paimon.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:32) ~[paimon-flink-common-0.9.0.jar:0.9.0]
at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:514) ~[flink-table-api-java-uber-1.20.0.jar:1.20.0]
at org.apache.flink.table.catalog.CatalogManager.initCatalog(CatalogManager.java:368) ~[flink-table-api-java-uber-1.20.0.jar:1.20.0]
at org.apache.flink.table.catalog.CatalogManager.createCatalog(CatalogManager.java:322) ~[flink-table-api-java-uber-1.20.0.jar:1.20.0]
at org.apache.flink.table.operations.ddl.CreateCatalogOperation.execute(CreateCatalogOperation.java:88) ~[flink-table-api-java-uber-1.20.0.jar:1.20.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1102) ~[flink-table-api-java-uber-1.20.0.jar:1.20.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:735) ~[flink-table-api-java-uber-1.20.0.jar:1.20.0]
at org.dinky.executor.DefaultTableEnvironment.executeSql(DefaultTableEnvironment.java:300) ~[dinky-client-1.20-1.2.0-rc5.jar:?]
at org.dinky.executor.Executor.executeSql(Executor.java:263) ~[dinky-core-1.2.0-rc5.jar:?]
at org.dinky.job.runner.JobDDLRunner.run(JobDDLRunner.java:84) ~[dinky-core-1.2.0-rc5.jar:?]
at org.dinky.job.JobManager.executeSql(JobManager.java:292) ~[dinky-core-1.2.0-rc5.jar:?]
at org.dinky.service.task.FlinkSqlTask.execute(FlinkSqlTask.java:70) ~[dinky-admin-1.2.0-rc5.jar:?]
at org.dinky.service.impl.TaskServiceImpl.executeJob(TaskServiceImpl.java:211) ~[dinky-admin-1.2.0-rc5.jar:?]
at org.dinky.service.impl.TaskServiceImpl.executeJob(TaskServiceImpl.java:202) ~[dinky-admin-1.2.0-rc5.jar:?]
at org.dinky.service.impl.TaskServiceImpl$$FastClassBySpringCGLIB$$087f7c.invoke(<generated>) ~[dinky-admin-1.2.0-rc5.jar:?]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.3.31.jar:5.3.31]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:792) ~[spring-aop-5.3.31.jar:5.3.31]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.3.31.jar:5.3.31]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:762) ~[spring-aop-5.3.31.jar:5.3.31]
at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:89) ~[spring-aop-5.3.31.jar:5.3.31]
at org.dinky.aop.ProcessAspect.processStepAround(ProcessAspect.java:110) ~[dinky-admin-1.2.0-rc5.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
I have already added flink-shaded-hadoop-2-uber-2.8.3-10.0.jar as the documentation says, but the error persists and I cannot get past it. What should I do? Posted: 6 months ago (12-19) · IP location: Sichuan
1 answer
Dinky does indeed run into jar conflicts like this fairly often. Note that the frames in your stack trace already come from flink-shaded-hadoop-3-3.1.1.7.2.9.0-173-9.0.jar, so adding the Hadoop 2 uber jar mixes in an older FsTracer whose get() signature no longer matches what the Hadoop 3 DFSClient expects. If you are on Dinky 1.0 or later, do not add flink-shaded-hadoop-2-uber-2.8.3-10.0.jar; use this specific build instead: flink-shaded-hadoop-3-uber-3.1.1.7.2.1.0-327-9.0.jar, download:
https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop-3-uber/3.1.1.7.2.1.0-327-9.0
Replace your jar with this specific build and the error goes away; with any other build the conflict will keep occurring.