[Problem description]
On a CentOS virtual machine, I can log in to a remote MySQL server from the command line:

```
[wang@localhost wordpress]$ mysql -h 10.73.144.231 -u root -p -P 3306
Enter password:
Welcome to the MariaDB monitor.  Commands end with ; or \g.
Your MySQL connection id is 4
Server version: 5.6.20 MySQL Community Server (GPL)
```
Note: the remote MySQL server is already configured to accept logins from any host. Command-line login used to fail and now works, but logging in from a PHP program still fails. I installed phpMyAdmin on CentOS and modified its configuration file as follows:

```php
/**
 * MySQL hostname or IP address
 *
 * @global string $cfg['Servers'][$i]['host']
 */
$cfg['Servers'][$i]['host'] = '10.73.144.231';

/**
 * MySQL port - leave blank for default port
 *
 * @global string $cfg['Servers'][$i]['port']
 */
$cfg['Servers'][$i]['port'] = '3306';

/**
 * Path to the socket - leave blank for default socket
 *
 * @global string $cfg['Servers'][$i]['socket']
 */
$cfg['Servers'][$i]['socket'] = '';

/**
 * Use SSL for connecting to MySQL server?
 *
 * @global boolean $cfg['Servers'][$i]['ssl']
 */
```
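One way to narrow this down is to take phpMyAdmin out of the picture and test the connection from a standalone PHP script. A minimal sketch (the file name and password placeholder are hypothetical; substitute the real credentials):

```php
<?php
// probe.php - connects directly to the remote server, bypassing phpMyAdmin.
$conn = mysqli_connect('10.73.144.231', 'root', 'YOUR_PASSWORD', '', 3306);
if (!$conn) {
    die('Connect failed: ' . mysqli_connect_error());
}
echo 'Connected. Server version: ' . mysqli_get_server_info($conn) . "\n";
mysqli_close($conn);
```

If this probe succeeds when run as `php probe.php` from the shell but fails when served through the web server, SELinux is a common culprit on CentOS: by default it prevents httpd from opening outbound database connections, and enabling the `httpd_can_network_connect_db` boolean (`setsebool -P httpd_can_network_connect_db 1`) is the usual fix.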
The project's code generation completes without errors, but when the generated JSP page is accessed, it throws the following error:

```
org.apache.jasper.JasperException: /WEB-INF/blog_file/guopengfei/1/1477405966557.jsp (line:14, column:9) #{...} is not allowed in template text
	at org.apache.jasper.compiler.DefaultErrorHandler.jspError(DefaultErrorHandler.java:42)
	at org.apache.jasper.compiler.ErrorDispatcher.dispatch(ErrorDispatcher.java:443)
```
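Context on the error: since JSP 2.1, the container parses `#{...}` in template text as a deferred EL expression, which is only permitted in tag attributes that accept deferred values, so any literal `#{` in page body text fails compilation. Assuming the `#{...}` in the generated page is meant to appear as literal text rather than be evaluated, a minimal sketch of the usual workaround:

```jsp
<%@ page deferredSyntaxAllowedAsLiteral="true" %>
<%-- With the directive above, #{...} in template text is emitted verbatim. --%>
<%-- Alternatively, escape each occurrence individually: \#{...} --%>
```

The same setting can also be applied container-wide through a `jsp-property-group` in web.xml (`<deferred-syntax-allowed-as-literal>true</deferred-syntax-allowed-as-literal>`), which is more practical when the pages are generated automatically.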
I've run into a rather odd problem over the past couple of days: after writing a Hive UDF and submitting a SQL statement that requires MapReduce, it fails with:

org.apache.hadoop.hive.ql.exec.UDFArgumentException: The UDF implementation class 'xxxxx' is not present in the class path

The code is as follows:

```java
package com.mzm.transformer.hive;

import com.mzm.common.GlobalConstants;
import com.mzm.utils.JdbcManager;
import org.apache.commons.lang.StringUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * UDF for fetching order information.
 * Created by Administrator on 2017/7/12.
 */
public class OrderInfoUDF extends UDF {

    // Database connection
    private Connection conn = null;

    // Cache, evicting the eldest entry once it holds more than 100 entries
    private Map<String, InnerOrderInfo> cache = new LinkedHashMap<String, InnerOrderInfo>() {
        @Override
        protected boolean removeEldestEntry(Map.Entry<String, InnerOrderInfo> eldest) {
            return size() > 100;
        }
    };

    public OrderInfoUDF() {
        Configuration conf = new Configuration();
        conf.addResource("transformer-env.xml");
        try {
            conn = JdbcManager.getConnection(conf, GlobalConstants.WAREHOUSE_OF_REPORT);
        } catch (SQLException e) {
            throw new RuntimeException("Failed to create the MySQL connection", e);
        }
        // Register a shutdown hook to close the connection
        Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
            public void run() {
                JdbcManager.close(conn, null, null);
            }
        }));
    }

    /**
     * Returns the order field selected by the flag for the given order ID.
     *
     * @param orderId
     * @param flag
     * @return
     */
    public Text evaluate(Text orderId, Text flag) {
        if (orderId == null || flag == null || StringUtils.isBlank(orderId.toString().trim())
                || StringUtils.isBlank(flag.toString().trim())) {
            throw new IllegalArgumentException("Invalid argument: the order id must not be empty");
        }
        String order = orderId.toString();
        InnerOrderInfo info = fetchInnerOrderInfo(order);
        Text defaultValue = new Text(GlobalConstants.DEFAULT_VALUE);
        String str = flag.toString();
        if ("pl".equals(str)) {
            return info == null || StringUtils.isBlank(info.getPlatform())
                    ? defaultValue : new Text(info.getPlatform());
        }
        if ("cut".equals(str)) {
            return info == null || StringUtils.isBlank(info.getCurrencyType())
                    ? defaultValue : new Text(info.getCurrencyType());
        }
        if ("pt".equals(str)) {
            return info == null || StringUtils.isBlank(info.getPaymentType())
                    ? defaultValue : new Text(info.getPaymentType());
        }
        throw new IllegalArgumentException(
                "Invalid argument: flag must be one of (pl, cut, pt); got: " + flag);
    }

    /**
     * Returns the order amount for the given order ID.
     *
     * @param orderId
     * @return
     */
    public IntWritable evaluate(Text orderId) {
        if (orderId == null || StringUtils.isBlank(orderId.toString().trim())) {
            throw new IllegalArgumentException("Invalid argument: the order id must not be empty");
        }
        String order = orderId.toString();
        InnerOrderInfo info = fetchInnerOrderInfo(order);
        return info == null ? new IntWritable(0) : new IntWritable(info.getAmount());
    }

    /**
     * Fetches the order info for the given order ID, consulting the cache first.
     *
     * @param orderId
     * @return
     */
    private InnerOrderInfo fetchInnerOrderInfo(String orderId) {
        InnerOrderInfo info = cache.get(orderId);
        if (info != null) {
            return info;
        }
        PreparedStatement pstmt = null;
        ResultSet rs = null;
        info = new InnerOrderInfo();
        try {
            pstmt = conn.prepareStatement("select order_id,platform,s_time,currency_type,payment_type,"
                    + "amount from order_info where order_id=?");
            int i = 0;
            pstmt.setString(++i, orderId.trim());
            rs = pstmt.executeQuery();
            if (rs.next()) {
                info.setOrderId(rs.getString("order_id"));
                info.setPlatform(rs.getString("platform"));
                info.setCurrencyType(rs.getString("currency_type"));
                info.setPaymentType(rs.getString("payment_type"));
                info.setsTime(rs.getLong("s_time"));
                info.setAmount(rs.getInt("amount"));
            }
            cache.put(orderId, info);
            return info;
        } catch (SQLException e) {
            throw new RuntimeException("Database query failed", e);
        } finally {
            JdbcManager.close(null, pstmt, rs);
        }
    }

    /**
     * Inner value holder for a single order.
     */
    private static class InnerOrderInfo {

        private String OrderId;
        private String currencyType;
        private String paymentType;
        private String platform;
        private long sTime;
        private int amount;

        public InnerOrderInfo() {
        }

        public InnerOrderInfo(String orderId, String currencyType, String paymentType,
                              String platform, long sTime, int amount) {
            OrderId = orderId;
            this.currencyType = currencyType;
            this.paymentType = paymentType;
            this.platform = platform;
            this.sTime = sTime;
            this.amount = amount;
        }

        public String getOrderId() {
            return OrderId;
        }

        public void setOrderId(String orderId) {
            OrderId = orderId;
        }

        public String getCurrencyType() {
            return currencyType;
        }

        public void setCurrencyType(String currencyType) {
            this.currencyType = currencyType;
        }

        public String getPaymentType() {
            return paymentType;
        }

        public void setPaymentType(String paymentType) {
            this.paymentType = paymentType;
        }

        public String getPlatform() {
            return platform;
        }

        public void setPlatform(String platform) {
            this.platform = platform;
        }

        public long getsTime() {
            return sTime;
        }

        public void setsTime(long sTime) {
            this.sTime = sTime;
        }

        public int getAmount() {
            return amount;
        }

        public void setAmount(int amount) {
            this.amount = amount;
        }
    }
}
```

The full error output is as follows (the tail of the query was lost when the post was saved):

```
hive (default)> select order_information(oid,'pl') as pl,from_unixtime(cast(s_time/1000 as bigint),'yyyy-MM-dd') as date,order_information(oid,'cut') as cut,order_information(oid,'pt') as pt,count(distinct oid) as orders from event_logs where en='e_cs' and pl is not null and s_time>=unix_timestamp('2017-07-05','yyyy-MM-dd')*1000 and s_time

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:446)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
	... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
	... 14 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
	... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:157)
	... 22 more
Caused by: org.apache.hadoop.hive.ql.exec.UDFArgumentException: The UDF implementation class 'com.mzm.transformer.hive.OrderInfoUDF' is not present in the class path
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.initialize(GenericUDFBridge.java:143)
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:116)
	at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:127)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.initializeOp(GroupByOperator.java:216)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:65)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.FilterOperator.initializeOp(FilterOperator.java:83)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:189)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:427)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:136)
	... 22 more

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1  Reduce: 1  HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
```

I've rebuilt the jar several times and always get the same error. Only this UDF fails; the other UDFs work fine and run HQL normally, whether or not MapReduce is required. Even this UDF works for HQL that needs no MapReduce, such as select order_info(oid) from event_logs; it only fails on HQL that requires MapReduce. This problem has been driving me crazy; any help would be appreciated.
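A plausible explanation for exactly this symptom: HQL that needs no MapReduce (a simple fetch) is evaluated inside the Hive client JVM, where the UDF jar is already on the local classpath, while MapReduce jobs instantiate the UDF in separate task JVMs that can only load jars shipped with the job. A sketch of registering the UDF so the jar travels with the job (the jar path here is hypothetical):

```sql
-- Run in the Hive CLI before submitting the query. ADD JAR puts the jar in
-- the distributed cache so map/reduce task JVMs can load OrderInfoUDF.
ADD JAR /path/to/transformer-udf.jar;
CREATE TEMPORARY FUNCTION order_information AS 'com.mzm.transformer.hive.OrderInfoUDF';
```

Alternatively, dropping the jar into `$HIVE_HOME/auxlib` (or listing it in `hive.aux.jars.path`) makes it available to every job without re-adding it per session.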
```
>> mex -setup
MEX configured to use 'Microsoft Visual C++ 2013 Professional (C)' for C language compilation.
Warning: The MATLAB C and Fortran API has changed to support MATLAB variables
         with more than 2^32-1 elements. In the near future you will be required
         to update your code to utilize the new API. You can find more
         information about this at:
         http://www.mathworks.com/help/matlab/matlab_external/upgrading-mex-files-to-use-64-bit-api.html.

To choose a different language, select one from the following:
 mex -setup C++
 mex -setup FORTRAN
MEX configured to use 'Microsoft Visual C++ 2013 Professional' for C++ language compilation.
Warning: The MATLAB C and Fortran API has changed to support MATLAB variables
         with more than 2^32-1 elements. In the near future you will be required
         to update your code to utilize the new API. You can find more
         information about this at:
         http://www.mathworks.com/help/matlab/matlab_external/upgrading-mex-files-to-use-64-bit-api.html.

>> mbuild -setup
MBUILD configured to use 'Microsoft Visual C++ 2013 Professional (C)' for C language compilation.

To choose a different language, select one from the following:
 mex -setup C++ -client MBUILD
 mex -setup FORTRAN -client MBUILD
MBUILD configured to use 'Microsoft Visual C++ 2013 Professional' for C++ language compilation.

>> mcc -W cpplib:libmyFunc -T link:lib myFunc
Compiling with 'Microsoft Visual C++ 2013 Professional'.
MEX completed successfully.
```
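The transcript above shows a successful build of a C++ shared library (libmyFunc) from the MATLAB function myFunc with mcc. For reference, a minimal sketch of calling the generated library from a host program; the prototype assumed here (one scalar input, one output) is hypothetical, since the real signature in the generated libmyFunc.h depends on myFunc's MATLAB signature:

```cpp
// main.cpp - hedged sketch; build with mbuild so the MATLAB runtime is linked in.
#include <iostream>
#include "libmyFunc.h"   // generated by mcc alongside the library

int main() {
    // Start the MATLAB runtime, then initialize the generated library.
    if (!mclInitializeApplication(NULL, 0) || !libmyFuncInitialize()) {
        std::cerr << "Initialization failed" << std::endl;
        return -1;
    }
    {
        mwArray in(2.0);     // scalar input (assumed)
        mwArray out;
        myFunc(1, out, in);  // 1 = number of outputs requested
        std::cout << "myFunc(2.0) = " << out << std::endl;
    } // mwArray objects must go out of scope before the runtime shuts down
    libmyFuncTerminate();
    mclTerminateApplication();
    return 0;
}
```

A typical build line on Windows would be `mbuild main.cpp libmyFunc.lib`, with libmyFunc.dll on the path at run time.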