How do I pull in third-party dependencies for a Hive UDF? I added them like this:
add file /home/chenxy/hive/GeoLite2-City.mmdb;
add jar /home/chenxy/jars/spark-hive_2.11-2.1.0.jar;
add jar /home/chenxy/jars/geoip2-2.12.0.jar;
add jar /home/chenxy/jars/IpCity.jar;
create temporary function ip2poi as 'com.tianzhuo.portrait.Ip2PoiUDF';
select ip2poi("183.128.104.19");
Running that query produces the following error log:
FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments '"183.128.104.19"': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.tianzhuo.portrait.Ip2PoiUDF.evaluate(java.lang.String) on object com.tianzhuo.portrait.Ip2PoiUDF@23592946 of class com.tianzhuo.portrait.Ip2PoiUDF with arguments {183.128.104.19:java.lang.String} of size 1
If the evaluate function just returns a plain string it works fine; the error only appears once I add the code that uses geoip2-2.12.0.jar.
Can anyone help? This has been blocking me for two days; I've searched Baidu and Google everywhere and found no workable solution.
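One thing I'm now suspecting (not confirmed): the SemanticException only wraps the real cause, and the underlying exception may be a missing transitive dependency of geoip2 (it needs maxmind-db and the Jackson libraries, which I never `add jar`'d), or the .mmdb file not being opened by its basename from the task working directory where `add file` places it. A hypothetical fix I'm considering is shading the whole geoip2 stack into IpCity.jar with the Maven Shade plugin, so a single `add jar` carries everything; a rough sketch of the pom.xml fragment, assuming a Maven build (plugin version is just an example):

```xml
<!-- Hypothetical pom.xml fragment: bundle geoip2 and its transitive
     dependencies (maxmind-db, Jackson) into IpCity.jar so one
     "add jar IpCity.jar" brings the full classpath along. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <artifactSet>
          <!-- shade only the geoip2 stack; leave Hive/Hadoop jars
               to the cluster as provided dependencies -->
          <includes>
            <include>com.maxmind.geoip2:geoip2</include>
            <include>com.maxmind.db:maxmind-db</include>
            <include>com.fasterxml.jackson.core:*</include>
          </includes>
        </artifactSet>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With that in place, only `add jar /home/chenxy/jars/IpCity.jar;` would be needed, and the geoip2-2.12.0.jar line could be dropped. But I'd still appreciate confirmation that this is the right approach.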