Python: iterating over and filtering a list of dictionaries

Background: the raw data is imported with column index3 stored as a JSON-formatted string.

Goal: extract the value of every "name" key from the dictionaries inside index3.

Current approach: decode with json.loads into a list of dictionaries, then loop over index3, loop over every dictionary in each list, loop over each dictionary's keys, and take the value whenever key == "name".

Note: each list contains multiple dictionaries, keys repeat across dictionaries, and before JSON decoding the value is a single-quoted str.

Problem: with three nested loops this becomes extremely slow once the data grows even moderately. Any fresh ideas would be much appreciated!

1 Answer

If you only need the name values, a regular expression might be faster. First take the third column, convert each row's list to a string, and use re.findall to match the value that follows name:

```
import re

ss = [{'name': 'aaa', 'age': '17'}, {'name': 'bbb', 'age': '17'}]
ss = str(ss)
# print(ss)

s = re.findall(r"'name': '([a-z]+)'", ss)
print(s)
# >> ['aaa', 'bbb']
```
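Another direction, if the json.loads route is kept: the third loop over each dictionary's keys is unnecessary, since a dict supports direct lookup by key. A minimal sketch (the sample rows below are my own illustration, not the asker's data):

```python
import json

# Hypothetical sample rows: each row is a str holding JSON
rows = [
    '[{"name": "Mary", "age": "7"}, {"name": "Jack", "age": "11"}]',
    '[{"name": "Lucy", "age": "9"}]',
]

# Parse each row once, then take "name" by direct key lookup --
# no loop over the dictionary's keys is needed
names = [d["name"] for row in rows for d in json.loads(row) if "name" in d]
print(names)  # ['Mary', 'Jack', 'Lucy']
```

This collapses the innermost loop into an O(1) lookup, so only two levels of iteration remain.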
Other related questions
Python: JSON decoding, filtering a list of dictionaries
Problem: the imported field index3 is a JSON string (a single-quoted str); I want to extract every value whose key is "name".

```
df_tmp  # pandas DataFrame; column index3 is a JSON string (single-quoted str)
df_tmp["index3"][0]
'[{"name": "Mary", "age":"7", "Sex":"F"},{"name":"Jack", "age":"11","Sex":"M"}]'
df_tmp["index3"][1]
'[{"name":"Jack", "age":"11","Sex":"M"},{"name":"Lucy","age":"9","Sex":"F"},{"name":"Nancy", "age":"10","Sex":"F"}]'
df_tmp["index3"][2]
'[{"name": "Luke", "age":"6", "Sex":"F"},{"name":"Lily", "age":"11","Sex":"F"}]'
```

Existing solution: decode the JSON into a list of dictionaries, then use three nested loops: over the DataFrame, over each list, and over each dictionary. Problem: it becomes very slow once the data grows. As a Python beginner I would like to hear other options, e.g. a suitable pandas function, or whether this could be handled in MySQL?
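For the pandas side, one commonly used pattern is Series.map with json.loads, which removes the explicit loop over the DataFrame; a sketch using the question's naming (df_tmp, index3), with shortened sample data of my own:

```python
import json
import pandas as pd

# Sample frame mirroring the question's layout (an assumption)
df_tmp = pd.DataFrame({"index3": [
    '[{"name": "Mary", "age": "7"}, {"name": "Jack", "age": "11"}]',
    '[{"name": "Lucy", "age": "9"}, {"name": "Nancy", "age": "10"}]',
]})

# Parse each cell exactly once, then pull "name" by direct key lookup
df_tmp["names"] = df_tmp["index3"].map(
    lambda s: [d["name"] for d in json.loads(s)]
)
print(df_tmp["names"].tolist())  # [['Mary', 'Jack'], ['Lucy', 'Nancy']]
```

Series.map still iterates row by row internally, but it avoids Python-level DataFrame indexing in the loop body and keeps the per-row work down to one parse plus one list comprehension.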
(Solved it myself) C++ word-quiz program over a vocabulary notebook, looking for bug hunters
1. Error codes: LNK2019, LNK1120

```
#include "exam.h"
#include "ui.h"
#include <iostream>
#include <string>
#include "stdlib.h"
#include "time.h"
using namespace std;

static const string exam_ui_prompt = "\n\n"
    "\t|*******************************|\n"
    "\t|*     请您选择考试类别        *|\n"
    "\t|*     1 看中文写英文          *|\n"
    "\t|*     2 看英文写中文          *|\n"
    "\t|*     3 看解释写成语          *|\n"
    "\t|*     0 返回上级              *|\n"
    "\t 请输入(1,2,3或0): ";

void exam_ui()
{
    void exam_ui_english_for_chinese();
    {
        // load the word dictionary
        dict_load(&dictionaries[ENGLISH_CHINESE], DICT_PATH DICT_ENGLISH_CHINESE);
        // generate the exam questions
        Examination exam;
        exam_create(&exam, &dictionaries[ENGLISH_CHINESE]);
        for (int i = 0; i < exam.words.size(); i++) {
            CLEAR();
            cout << "\n\n";
            cout << "\t|**************************|\n";
            meaning_display(exam.meanings[i]);
            cout << "\n";
            cout << "\t|*************************|\n";
            string word = get_input_string("\t 请输入英文答案(0 退出) : ");
            cout << "\t 请输入英文答案 (0 退出):";
            std::getline(cin, word);
            if (word == exam.words[i]) {
                cout << "\t 正确!!!" << endl;
                string op = get_input_string("\t 请输入(0 退出, 其他 下一题):");
                if (bool op = "0") {
                    break;
                }
            }
            else if (word == "0") {
                break;
            }
            else {
                cout << "\t 错误, 继续加油哦" << endl;
                // keep going: answer this question again
                i--;
            }
        }
    }
    CLEAR();
    cout << exam_ui_prompt;
    string op;
    std::getline(cin, op);
    if (op == "1") {
        exam_ui_english_for_chinese();
    }
    else if (op == "2") {
    }
    else if (op == "0") {
        return;
    }
    exam_ui();
}

/** Generate a random integer in [min, max) */
static int radom_index(int min, int max)
{
    return (int)(min + (double)rand() / (double)RAND_MAX * (max - min));
}

/* Randomly generate an exam paper from the dictionary */
void exam_create(Examination* exam, Dictionary* dict)
{
    srand(time(NULL));
    int wordcount = dict->words.size();
    int itemcount = wordcount < 10 ? wordcount : 10;
    for (int i = 0; i < itemcount; i++) {
        int idx = radom_index(0, wordcount);
        exam->words.push_back(dict->words[idx]);
        exam->meanings.push_back(dict->meanings[idx]);
    }
}

void exam_ui_display() { return; }
void exam_ui_chinese_for_english() { return; }
void exam_ui_english_for_chinese() { }
```

3. As far as I can see, my header file declares and defines everything, but I still don't know how to fix this error:

```
#ifndef _EXAM_H
#define _EXAM_H
#include <string>
#include <vector>
#include "dict.h"
using namespace std;

typedef struct {
    vector<string> words;
    vector< vector<string> > meanings;
} Examination;

void exam_create(Examination* exam, Dictionary* dict);
void exam_ui_display();
void exam_ui_chinese_for_english();
void exam_ui_english_for_chinese();

bool op;
string word;
#endif //!_EXAM_H
```
e-Market: a calculation question
Description

The city of Hakodate recently established a commodity exchange market. To participate in the market, each dealer transmits through the Internet an order consisting of his or her name, the type of the order (buy or sell), the name of the commodity, and the quoted price. In this market a deal can be made only if the price of a sell order is lower than or equal to the price of a buy order. The price of the deal is the mean of the prices of the buy and sell orders, where the mean price is rounded downward to the nearest integer. To exclude dishonest deals, no deal is made between a pair of sell and buy orders from the same dealer.

The system of the market maintains the list of orders for which a deal has not been made, and processes a new order in the following manner. For a new sell order, a deal is made with the buy order with the highest price in the list satisfying the conditions; if there is more than one buy order with the same price, the deal is made with the earliest of them. For a new buy order, a deal is made with the sell order with the lowest price in the list satisfying the conditions; if there is more than one sell order with the same price, the deal is made with the earliest of them.

The market opens at 7:00 and closes at 22:00 every day. When the market closes, all the remaining orders are cancelled. To keep a complete record of the market, the system saves all the orders it receives every day.

The manager of the market asked the system administrator to make a program which reports the activity of the market. The report must contain two kinds of information. For each commodity, the report must contain information on the lowest, the average and the highest prices of successful deals. For each dealer, the report must contain information on the amounts the dealer paid and received for commodities.

Input

The input contains several data sets. Each data set represents the record of the market on one day.
The first line of each data set contains an integer n (n < 1000), the number of orders in the record. Each subsequent line describes an order: the name of the dealer, the type of the order, the name of the commodity, and the quoted price, separated by single space characters. The name of a dealer consists of capital letters and is less than 10 characters in length. The type of an order is the string "BUY" or "SELL". The name of a commodity is a single capital letter. The quoted price is a positive integer less than 1000. The orders in a record are arranged by the time they were received; the first line of the record is the oldest order. The end of the input is indicated by a line containing a zero.

Output

The output for each data set consists of two parts separated by a line containing two hyphen ('-') characters.

The first part is the output for commodities. For each commodity, output its name and the lowest, the average and the highest prices of the successful deals on one line, separated by space characters. The average price is rounded downward to the nearest integer. The output should contain only the commodities for which deals were made, in alphabetical order.

The second part is the output for dealers. For each dealer, output the dealer's name and the amounts the dealer paid and received for commodities, separated by space characters. The output should contain all the dealers who transmitted orders, in lexicographic order of their names (i.e., dictionary order).

The output for each data set should be followed by a line containing ten hyphen ('-') characters.
Sample Input

```
3
PERLIS SELL A 300
WILKES BUY A 200
HAMMING SELL A 100
4
BACKUS SELL A 10
FLOYD BUY A 20
IVERSON SELL B 30
BACKUS BUY B 40
7
WILKINSON SELL A 500
MCCARTHY BUY C 300
WILKINSON SELL C 200
DIJKSTRA SELL B 100
BACHMAN BUY A 400
DIJKSTRA BUY A 600
WILKINSON SELL A 300
2
ABCD SELL X 10
ABC BUY X 15
2
A SELL M 100
A BUY M 100
0
```

Sample Output

```
A 150 150 150
--
HAMMING 0 150
PERLIS 0 0
WILKES 150 0
----------
A 15 15 15
B 35 35 35
--
BACKUS 35 15
FLOYD 15 0
IVERSON 0 35
----------
A 350 450 550
C 250 250 250
--
BACHMAN 350 0
DIJKSTRA 550 0
MCCARTHY 250 0
WILKINSON 0 1150
----------
X 12 12 12
--
ABC 12 0
ABCD 0 12
----------
--
A 0 0
----------
```
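The matching rules above can be sketched in Python; this is a minimal sketch of one day's order processing (function and variable names are my own, and input parsing plus report formatting are omitted):

```python
def process_orders(orders):
    """orders: list of (dealer, 'BUY'|'SELL', commodity, price) tuples,
    oldest first. Returns deals as (commodity, price, buyer, seller)."""
    book = []   # resting unmatched orders, kept in arrival order
    deals = []
    for dealer, typ, com, price in orders:
        best = None
        for i, (d2, t2, c2, p2) in enumerate(book):
            if c2 != com or d2 == dealer:
                continue  # wrong commodity, or same dealer (excluded)
            if typ == "SELL" and t2 == "BUY" and p2 >= price:
                # match the highest-priced buy; strict > keeps the earliest on ties
                if best is None or p2 > book[best][3]:
                    best = i
            elif typ == "BUY" and t2 == "SELL" and p2 <= price:
                # match the lowest-priced sell; strict < keeps the earliest on ties
                if best is None or p2 < book[best][3]:
                    best = i
        if best is None:
            book.append((dealer, typ, com, price))
        else:
            d2, t2, c2, p2 = book.pop(best)
            deal_price = (price + p2) // 2  # mean, rounded down
            buyer, seller = (d2, dealer) if t2 == "BUY" else (dealer, d2)
            deals.append((com, deal_price, buyer, seller))
    return deals

# First sample data set from the problem statement
deals = process_orders([
    ("PERLIS", "SELL", "A", 300),
    ("WILKES", "BUY", "A", 200),
    ("HAMMING", "SELL", "A", 100),
])
print(deals)  # [('A', 150, 'WILKES', 'HAMMING')]
```

From the returned deals, the per-commodity min/avg/max and per-dealer paid/received totals in the report follow by simple aggregation.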
Why does this TreeSet store duplicate values once a comparator is added?
```
package Work1;

import java.util.ArrayList;
import java.util.List;

public class DictionariesDemo {
    public static void main(String[] args) {
        List<String> list = new ArrayList<String>();
        list.add("list");
        list.add("sum");
        list.add("list");
        list.add("avg");
        list.add("dictionaries");
        list.add("demo");
        ComparableStr comparableStr = new ComparableStr();
        comparableStr.sort(list);
        for (String li : list) {
            System.out.println(li);
        }
    }
}
```

```
package Work1;

import java.util.Comparator;
import java.util.List;
import java.util.TreeSet;

public class ComparableStr {
    public void sort(List<String> list) {
        TreeSet<String> treeSet = new TreeSet<String>(new Comparator<String>() {
            @Override
            public int compare(String o1, String o2) {
                int flag = o1.compareTo(o2);
                if (flag >= 0) {
                    flag = 1;
                }
                return flag;
            }
        });
        treeSet.addAll(list);
        list.clear();
        list.addAll(treeSet);
    }
}
```
A question about map and list containers
How do I implement this problem in C++?

Write a Python function called tally that consumes a list of strings representing all the participants' town visits (called log) and produces a dictionary with the participants as the keys and the total points earned as the corresponding values.

Notes:
• Each element in log is a string with three comma-separated values: the first value is the participant id, the second is the town, and the third is the points earned in that town.
• If the participant visited a town more than once, then tally must assign None to that participant's total points to show they are disqualified.
• The list log may be empty, but all strings in log will be in the correct format.
• Recall dictionaries do not have an associated order: two dictionaries are equal if they have the same set of keys, and each key has the same associated value. This means your produced dictionary may have keys in any order when printed.

Example 1:

```
tally(["jsmith,Elora,2", "jsmith,St. Jacobs,4", "klee,Elora,3",
       "proth,Conestogo,4", "kafka,Heidelberg,2", "klee,Heidelberg,5",
       "kafka,Elora,1", "klee,St. Jacobs,5", "jsmith,Heidelberg,1"])
=> { "jsmith" : 7, "klee" : 13, "kafka" : 3, "proth" : 4 }
```

Example 2:

```
tally(["ricky,Linwood,4", "janed,Linwood,3", "janed,Wallenstein,2",
       "ricky,Linwood,5", "mog,Conestogo,2"])
=> { "mog" : 2, "ricky" : None, "janed" : 5 }
```
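Although the asker wants C++, the assignment itself specifies a Python function; a minimal Python sketch of the spelled-out logic follows (implementation details are my own), which can then be ported to C++ with a std::map and std::set:

```python
def tally(log):
    """Each log entry is a "participant,town,points" string (per the prompt)."""
    points = {}   # participant -> total points, or None once disqualified
    visited = {}  # participant -> set of towns already visited
    for entry in log:
        pid, town, pts = entry.split(",")
        seen = visited.setdefault(pid, set())
        if town in seen:
            points[pid] = None  # repeat visit: disqualified
            continue
        seen.add(town)
        if points.get(pid, 0) is None:
            continue  # already disqualified, total stays None
        points[pid] = points.get(pid, 0) + int(pts)
    return points

result = tally(["ricky,Linwood,4", "janed,Linwood,3", "janed,Wallenstein,2",
                "ricky,Linwood,5", "mog,Conestogo,2"])
print(result)  # {'ricky': None, 'janed': 5, 'mog': 2}
```

Note that towns such as "St. Jacobs" contain spaces but never commas, so a plain split(",") is safe here.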
asp.net core + EF: error when querying the database
Querying the database under asp.net core + EF throws an error.

Entity code:

```
[Table("Dictionary")]
[Serializable]
public partial class Dictionary
{
    /// <summary>Dictionary entry ID</summary>
    public Guid Id { get; set; }
    /// <summary>Dictionary name</summary>
    public string Name { get; set; }
    /// <summary>Dictionary code</summary>
    public string Code { get; set; }
    /// <summary>Data type</summary>
    public int? Type { get; set; }
    /// <summary>Parent ID</summary>
    public Guid Pid { get; set; }
    /// <summary>Sort weight</summary>
    public int? Sort { get; set; }
    /// <summary>Enabled flag</summary>
    public bool Enabled { get; set; }
    /// <summary>Deleted flag</summary>
    public bool Deleted { get; set; }
    /// <summary>Creator user ID</summary>
    public Guid CreateId { get; set; }
    /// <summary>Creation time</summary>
    public DateTime? CreateTime { get; set; }
    /// <summary>Updater user ID</summary>
    public Guid UpdateId { get; set; }
    /// <summary>Update time</summary>
    public DateTime? UpdateTime { get; set; }
    /// <summary>Remark</summary>
    public string Remark { get; set; }
}
```

DbContext code:

```
public class EntityDbContext : DbContext
{
    public EntityDbContext(DbContextOptions options) : base(options) { }

    public DbSet<Category> Categories { get; set; }
    public DbSet<SysUser> SysUsers { get; set; }
    public DbSet<SysUserToken> SysUserTokenes { get; set; }
    public DbSet<SysUserLoginLog> SysUserLoginLogs { get; set; }
    public DbSet<SysUserRole> SysUserRoles { get; set; }
    public DbSet<SysPermission> SysPermissions { get; set; }
    public DbSet<Dictionary> Dictionaries { get; set; }
}
```

Calling code:

```
private IRepository<Entities.Dictionary> _dictionaryRepository;
private IHttpContextAccessor _accessor;

public DictionaryService(IRepository<Entities.Dictionary> dictionaryRepository,
                         IHttpContextAccessor accessor)
{
    this._dictionaryRepository = dictionaryRepository;
    this._accessor = accessor;
}

/// <summary>Get all dictionary entries and cache them</summary>
public List<Entities.Dictionary> getAll()
{
    var result = _dictionaryRepository.Table.Where(o => !o.Deleted);
    var ex = result.Expression;
    //result = result.Where(o => o.Code == "KGJ-2019-T-001" || o.Code == "KGJ-2019-T-002");
    //return result.OrderBy(o => o.Type).ThenBy(o => o.CreateTime).ToList();
    throw new NotImplementedException(ex.ToString());
}
```

Error: ![screenshot](https://img-ask.csdn.net/upload/201908/27/1566883198_22483.png)
Redis: clients cannot connect after the server has been running for a while
After Redis has been running for a day, connecting with redis-cli or jredis fails with connection timeouts or connection refused; restarting the Redis service makes it usable again. Could someone help figure out what is wrong? The config file is as follows:

```
port 6379
tcp-backlog 511
timeout 60000
tcp-keepalive 0
loglevel debug
logfile "D:\\redis-2.8.19\\redis.log"
databases 16
save 900 1
save 300 10
save 60 10000
stop-writes-on-bgsave-error yes
rdbcompression yes
rdbchecksum yes
dbfilename dump.rdb
dir ./
slave-serve-stale-data yes
slave-read-only yes
repl-diskless-sync no
repl-diskless-sync-delay 5
repl-disable-tcp-nodelay no
slave-priority 100
requirepass foobared
maxclients 10000
maxheap 2gb
maxmemory 2gb
maxmemory-policy volatile-lru
appendonly no
appendfsync everysec
no-appendfsync-on-rewrite no
auto-aof-rewrite-percentage 100
auto-aof-rewrite-min-size 64mb
aof-load-truncated yes
lua-time-limit 5000
slowlog-log-slower-than 10000
slowlog-max-len 128

################################ LATENCY MONITOR ##############################

# The Redis latency monitoring subsystem samples different operations
# at runtime in order to collect data related to possible sources of
# latency of a Redis instance.
#
# Via the LATENCY command this information is available to the user that can
# print graphs and obtain reports.
#
# The system only logs operations that were performed in a time equal or
# greater than the amount of milliseconds specified via the
# latency-monitor-threshold configuration directive. When its value is set
# to zero, the latency monitor is turned off.
#
# By default latency monitoring is disabled since it is mostly not needed
# if you don't have latency issues, and collecting data has a performance
# impact, that while very small, can be measured under big load. Latency
# monitoring can easily be enabled at runtime using the command
# "CONFIG SET latency-monitor-threshold <milliseconds>" if needed.
latency-monitor-threshold 0

############################# Event notification ##############################

# Redis can notify Pub/Sub clients about events happening in the key space.
# This feature is documented at http://redis.io/topics/notifications
#
# For instance if keyspace events notification is enabled, and a client
# performs a DEL operation on key "foo" stored in the Database 0, two
# messages will be published via Pub/Sub:
#
# PUBLISH __keyspace@0__:foo del
# PUBLISH __keyevent@0__:del foo
#
# It is possible to select the events that Redis will notify among a set
# of classes. Every class is identified by a single character:
#
#  K     Keyspace events, published with __keyspace@<db>__ prefix.
#  E     Keyevent events, published with __keyevent@<db>__ prefix.
#  g     Generic commands (non-type specific) like DEL, EXPIRE, RENAME, ...
#  $     String commands
#  l     List commands
#  s     Set commands
#  h     Hash commands
#  z     Sorted set commands
#  x     Expired events (events generated every time a key expires)
#  e     Evicted events (events generated when a key is evicted for maxmemory)
#  A     Alias for g$lshzxe, so that the "AKE" string means all the events.
#
# The "notify-keyspace-events" takes as argument a string that is composed
# of zero or multiple characters. The empty string means that notifications
# are disabled.
#
# Example: to enable list and generic events, from the point of view of the
# event name, use:
#
#  notify-keyspace-events Elg
#
# Example 2: to get the stream of the expired keys subscribing to channel
# name __keyevent@0__:expired use:
#
#  notify-keyspace-events Ex
#
# By default all notifications are disabled because most users don't need
# this feature and the feature has some overhead. Note that if you don't
# specify at least one of K or E, no events will be delivered.
notify-keyspace-events ""

############################### ADVANCED CONFIG ###############################

# Hashes are encoded using a memory efficient data structure when they have a
# small number of entries, and the biggest entry does not exceed a given
# threshold. These thresholds can be configured using the following directives.
hash-max-ziplist-entries 512
hash-max-ziplist-value 64

# Similarly to hashes, small lists are also encoded in a special way in order
# to save a lot of space. The special representation is only used when
# you are under the following limits:
list-max-ziplist-entries 512
list-max-ziplist-value 64

# Sets have a special encoding in just one case: when a set is composed
# of just strings that happen to be integers in radix 10 in the range
# of 64 bit signed integers.
# The following configuration setting sets the limit in the size of the
# set in order to use this special memory saving encoding.
set-max-intset-entries 512

# Similarly to hashes and lists, sorted sets are also specially encoded in
# order to save a lot of space. This encoding is only used when the length and
# elements of a sorted set are below the following limits:
zset-max-ziplist-entries 128
zset-max-ziplist-value 64

# HyperLogLog sparse representation bytes limit. The limit includes the
# 16 bytes header. When an HyperLogLog using the sparse representation crosses
# this limit, it is converted into the dense representation.
#
# A value greater than 16000 is totally useless, since at that point the
# dense representation is more memory efficient.
#
# The suggested value is ~ 3000 in order to have the benefits of
# the space efficient encoding without slowing down too much PFADD,
# which is O(N) with the sparse encoding. The value can be raised to
# ~ 10000 when CPU is not a concern, but space is, and the data set is
# composed of many HyperLogLogs with cardinality in the 0 - 15000 range.
hll-sparse-max-bytes 3000

# Active rehashing uses 1 millisecond every 100 milliseconds of CPU time in
# order to help rehashing the main Redis hash table (the one mapping top-level
# keys to values). The hash table implementation Redis uses (see dict.c)
# performs a lazy rehashing: the more operation you run into a hash table
# that is rehashing, the more rehashing "steps" are performed, so if the
# server is idle the rehashing is never complete and some more memory is used
# by the hash table.
#
# The default is to use this millisecond 10 times every second in order to
# actively rehash the main dictionaries, freeing memory when possible.
#
# If unsure:
# use "activerehashing no" if you have hard latency requirements and it is
# not a good thing in your environment that Redis can reply from time to time
# to queries with 2 milliseconds delay.
#
# use "activerehashing yes" if you don't have such hard requirements but
# want to free memory asap when possible.
activerehashing yes

# The client output buffer limits can be used to force disconnection of clients
# that are not reading data from the server fast enough for some reason (a
# common reason is that a Pub/Sub client can't consume messages as fast as the
# publisher can produce them).
#
# The limit can be set differently for the three different classes of clients:
#
# normal -> normal clients including MONITOR clients
# slave  -> slave clients
# pubsub -> clients subscribed to at least one pubsub channel or pattern
#
# The syntax of every client-output-buffer-limit directive is the following:
#
# client-output-buffer-limit <class> <hard limit> <soft limit> <soft seconds>
#
# A client is immediately disconnected once the hard limit is reached, or if
# the soft limit is reached and remains reached for the specified number of
# seconds (continuously).
# So for instance if the hard limit is 32 megabytes and the soft limit is
# 16 megabytes / 10 seconds, the client will get disconnected immediately
# if the size of the output buffers reach 32 megabytes, but will also get
# disconnected if the client reaches 16 megabytes and continuously overcomes
# the limit for 10 seconds.
#
# By default normal clients are not limited because they don't receive data
# without asking (in a push way), but just after a request, so only
# asynchronous clients may create a scenario where data is requested faster
# than it can read.
#
# Instead there is a default limit for pubsub and slave clients, since
# subscribers and slaves receive data in a push fashion.
#
# Both the hard or the soft limit can be disabled by setting them to zero.
client-output-buffer-limit normal 0 0 0
client-output-buffer-limit slave 256mb 64mb 60
client-output-buffer-limit pubsub 32mb 8mb 60

# Redis calls an internal function to perform many background tasks, like
# closing connections of clients in timeout, purging expired keys that are
# never requested, and so forth.
#
# Not all tasks are performed with the same frequency, but Redis checks for
# tasks to perform according to the specified "hz" value.
#
# By default "hz" is set to 10. Raising the value will use more CPU when
# Redis is idle, but at the same time will make Redis more responsive when
# there are many keys expiring at the same time, and timeouts may be
# handled with more precision.
#
# The range is between 1 and 500, however a value over 100 is usually not
# a good idea. Most users should use the default of 10 and raise this up to
# 100 only in environments where very low latency is required.
hz 10

# When a child rewrites the AOF file, if the following option is enabled
# the file will be fsync-ed every 32 MB of data generated. This is useful
# in order to commit the file to the disk more incrementally and avoid
# big latency spikes.
aof-rewrite-incremental-fsync yes

################################## INCLUDES ###################################

# Include one or more other config files here. This is useful if you
# have a standard template that goes to all Redis server but also need
# to customize a few per-server settings. Include files can include
# other files, so use this wisely.
#
# include /path/to/local.conf
# include /path/to/other.conf
```
How to fix org.apache.ibatis.exceptions.PersistenceException?
While debugging an open-source project, every SQL statement in the project throws a NullPointerException:

```
SEVERE: Servlet.service() for servlet [springMvc] in context with path [/mallapp] threw exception [Request processing failed; nested exception is org.mybatis.spring.MyBatisSystemException: nested exception is org.apache.ibatis.exceptions.PersistenceException:
### Error querying database. Cause: java.lang.NullPointerException
### The error may exist in file [H:\WorksSpace\.metadata\.plugins\org.eclipse.wst.server.core\tmp2\wtpwebapps\mallapp\WEB-INF\classes\mybatis\category\CategoryMapper.xml]
### The error may involve CategoryMapper.listAll-Inline
### The error occurred while setting parameters
### SQL: select category_name, category_img, sort, category_id, super_id from shop_category where super_id =? order by sort
### Cause: java.lang.NullPointerException] with root cause
java.lang.NullPointerException
    at org.apache.ibatis.type.BaseTypeHandler.setParameter(BaseTypeHandler.java:43)
    at org.apache.ibatis.scripting.defaults.DefaultParameterHandler.setParameters(DefaultParameterHandler.java:81)
    at org.apache.ibatis.executor.statement.PreparedStatementHandler.parameterize(PreparedStatementHandler.java:80)
    at org.apache.ibatis.executor.statement.RoutingStatementHandler.parameterize(RoutingStatementHandler.java:61)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.ibatis.plugin.Plugin.invoke(Plugin.java:62)
    at com.sun.proxy.$Proxy84.parameterize(Unknown Source)
    at org.apache.ibatis.executor.ReuseExecutor.prepareStatement(ReuseExecutor.java:79)
    at org.apache.ibatis.executor.ReuseExecutor.doQuery(ReuseExecutor.java:56)
    at org.apache.ibatis.executor.BaseExecutor.queryFromDatabase(BaseExecutor.java:267)
    at org.apache.ibatis.executor.BaseExecutor.query(BaseExecutor.java:137)
    at org.apache.ibatis.executor.CachingExecutor.query(CachingExecutor.java:96)
    at org.apache.ibatis.executor.CachingExecutor.query(CachingExecutor.java:77)
    at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:108)
    at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:102)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:358)
    at com.sun.proxy.$Proxy74.selectList(Unknown Source)
    at org.mybatis.spring.SqlSessionTemplate.selectList(SqlSessionTemplate.java:198)
    at com.yq.dao.DaoSupport.findForList(DaoSupport.java:118)
    at com.yq.service.category.impl.CategoryService.listAll(CategoryService.java:62)
    at com.yq.service.category.impl.CategoryService$$FastClassBySpringCGLIB$$51d95cd7.invoke(<generated>)
    at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
    at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:711)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
    at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
    at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:262)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
    at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
    at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:644)
    at com.yq.service.category.impl.CategoryService$$EnhancerBySpringCGLIB$$c2253a36.listAll(<generated>)
    at com.yq.controller.category.CategoryController.list(CategoryController.java:80)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:215)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:132)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:104)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandleMethod(RequestMappingHandlerAdapter.java:749)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:690)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:83)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:945)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:876)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:961)
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:852)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:837)
    at
```
javax.servlet.http.HttpServlet.service(HttpServlet.java:731) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) at com.alibaba.druid.support.http.WebStatFilter.doFilter(WebStatFilter.java:123) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:108) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103) at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:956) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:436) at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1078) at 
org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:625) at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) at java.lang.Thread.run(Thread.java:745) ``` CategoryMapper.xml ``` <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd"> <mapper namespace="CategoryMapper"> <!--表名 --> <sql id="tableName"> shop_category </sql> <!-- 字段 --> <sql id="Field"> category_name, category_img, sort, category_id, super_id </sql> <!-- 字段值 --> <sql id="FieldValue"> #{category_name}, #{category_img}, #{sort}, #{category_id}, #{super_id} </sql> <!-- 新增--> <insert id="save" parameterType="pd"> insert into <include refid="tableName"></include> ( <include refid="Field"></include> ) values ( <include refid="FieldValue"></include> ) </insert> <!-- 删除--> <delete id="delete" parameterType="pd"> delete from <include refid="tableName"></include> where category_id = #{category_id} </delete> <!-- 修改 --> <update id="edit" parameterType="pd"> update <include refid="tableName"></include> set category_name = #{category_name}, category_img = #{category_img}, sort = #{sort} where category_id = #{category_id} </update> <!-- 通过ID获取数据 --> <select id="findById" parameterType="pd" resultType="pd"> select <include refid="Field"></include> from <include refid="tableName"></include> where category_id = #{category_id} </select> <!-- 列表 --> <select id="datalistPage" parameterType="page" resultType="pd"> select <include refid="Field"></include> from <include refid="tableName"></include> where 1=1 and super_id =#{pd.super_id} order by sort </select> <!-- 列表(全部) --> <select id="listAll" parameterType="pd" 
resultType="pd"> select <include refid="Field"></include> from <include refid="tableName"></include> where super_id =#{super_id} order by sort </select> <!-- 批量删除 --> <delete id="deleteAll" parameterType="String"> delete from <include refid="tableName"></include> where category_id in <foreach item="item" index="index" collection="array" open="(" separator="," close=")"> #{item} </foreach> </delete> </mapper> ``` mybatis-config.xml ``` <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD SQL Map Config 3.0//EN" "http://mybatis.org/dtd/mybatis-3-config.dtd"> <configuration> <settings> <setting name="cacheEnabled" value="true" /><!-- 全局映射器启用缓存 --> <setting name="useGeneratedKeys" value="true" /> <setting name="defaultExecutorType" value="REUSE" /> <!-- 打印查询语句 --> <setting name="logImpl" value="LOG4J" /> </settings> <typeAliases> <typeAlias type="org.change.entity.system.User" alias="User"/> <typeAlias type="org.change.entity.system.Role" alias="Role"/> <typeAlias type="org.change.entity.system.Menu" alias="Menu"/> <typeAlias type="org.change.entity.system.Dictionaries" alias="Dictionaries"/> <typeAlias type="org.change.entity.system.Department" alias="Department"/> <typeAlias type="org.change.util.PageData" alias="pd"/> <!-- 分页 --> <typeAlias type="org.change.entity.Page" alias="Page"/> </typeAliases> <plugins> <plugin interceptor="org.change.plugin.PagePlugin"> <property name="dialect" value="mysql"/> <property name="pageSqlId" value=".*listPage.*"/> </plugin> </plugins> </configuration> ``` DAO.java ``` public interface DAO { /** * 保存对象 * @param str * @param obj * @return * @throws Exception */ public Object save(String str, Object obj) throws Exception; /** * 修改对象 * @param str * @param obj * @return * @throws Exception */ public Object update(String str, Object obj) throws Exception; /** * 删除对象 * @param str * @param obj * @return * @throws Exception */ public Object delete(String str, Object obj) throws Exception; /** * 查找对象 * 
@param str * @param obj * @return * @throws Exception */ public Object findForObject(String str, Object obj) throws Exception; /** * 查找对象 * @param str * @param obj * @return * @throws Exception */ public Object findForList(String str, Object obj) throws Exception; /** * 查找对象封装成Map * @param s * @param obj * @return * @throws Exception */ public Object findForMap(String sql, Object obj, String key , String value) throws Exception; } ``` DaoSupport.java ``` @Repository("daoSupport") public class DaoSupport implements DAO { @Resource(name = "sqlSessionTemplate") private SqlSessionTemplate sqlSessionTemplate; /** * 保存对象 * @param str * @param obj * @return * @throws Exception */ public Object save(String str, Object obj) throws Exception { return sqlSessionTemplate.insert(str, obj); } /** * 批量更新 * @param str * @param obj * @return * @throws Exception */ public Object batchSave(String str, List objs )throws Exception{ return sqlSessionTemplate.insert(str, objs); } /** * 修改对象 * @param str * @param obj * @return * @throws Exception */ public Object update(String str, Object obj) throws Exception { return sqlSessionTemplate.update(str, obj); } /** * 批量更新 * @param str * @param obj * @return * @throws Exception */ public void batchUpdate(String str, List objs )throws Exception{ SqlSessionFactory sqlSessionFactory = sqlSessionTemplate.getSqlSessionFactory(); //批量执行器 SqlSession sqlSession = sqlSessionFactory.openSession(ExecutorType.BATCH,false); try{ if(objs!=null){ for(int i=0,size=objs.size();i<size;i++){ sqlSession.update(str, objs.get(i)); } sqlSession.flushStatements(); sqlSession.commit(); sqlSession.clearCache(); } }finally{ sqlSession.close(); } } /** * 批量更新 * @param str * @param obj * @return * @throws Exception */ public Object batchDelete(String str, List objs )throws Exception{ return sqlSessionTemplate.delete(str, objs); } /** * 删除对象 * @param str * @param obj * @return * @throws Exception */ public Object delete(String str, Object obj) throws Exception { return 
sqlSessionTemplate.delete(str, obj); } /** * 查找对象 * @param str * @param obj * @return * @throws Exception */ public Object findForObject(String str, Object obj) throws Exception { return sqlSessionTemplate.selectOne(str, obj); } /** * 查找对象 * @param str * @param obj * @return * @throws Exception */ public Object findForList(String str, Object obj) throws Exception { return sqlSessionTemplate.selectList(str, obj); } public Object findForMap(String str, Object obj, String key, String value) throws Exception { return sqlSessionTemplate.selectMap(str, obj, key); } } ``` spring.xml ``` <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:aop="http://www.springframework.org/schema/aop" xmlns:context="http://www.springframework.org/schema/context" xmlns:tx="http://www.springframework.org/schema/tx" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop.xsd http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx.xsd "> <!-- 配置事务管理器 --> <bean name="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager"> <property name="dataSource" ref="dataSource"></property> </bean> <bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"> <property name="location" value="classpath:jdbc.properties"/> </bean> <!-- 阿里 druid数据库连接池 --> <bean id="dataSource" class="com.alibaba.druid.pool.DruidDataSource" destroy-method="close"> <!-- 数据库基本信息配置 --> <property name="driverClassName" value="${driverClassName}" /> <property name="url" value="${url}" /> <property name="username" value="${username}" /> 
<property name="password" value="${password}" /> <property name="filters" value="${filters}" /> <!-- 最大并发连接数 --> <property name="maxActive" value="${maxActive}" /> <!-- 初始化连接数量 --> <property name="initialSize" value="${initialSize}" /> <!-- 配置获取连接等待超时的时间 --> <property name="maxWait" value="${maxWait}" /> <!-- 最小空闲连接数 --> <property name="minIdle" value="${minIdle}" /> <!-- 配置间隔多久才进行一次检测,检测需要关闭的空闲连接,单位是毫秒 --> <property name="timeBetweenEvictionRunsMillis" value="${timeBetweenEvictionRunsMillis}" /> <!-- 配置一个连接在池中最小生存的时间,单位是毫秒 --> <property name="minEvictableIdleTimeMillis" value="${minEvictableIdleTimeMillis}" /> <property name="validationQuery" value="${validationQuery}" /> <property name="testWhileIdle" value="${testWhileIdle}" /> <property name="testOnBorrow" value="${testOnBorrow}" /> <property name="testOnReturn" value="${testOnReturn}" /> <property name="maxOpenPreparedStatements" value="${maxOpenPreparedStatements}" /> <!-- 打开removeAbandoned功能 --> <property name="removeAbandoned" value="${removeAbandoned}" /> <!-- 1800秒,也就是30分钟 --> <property name="removeAbandonedTimeout" value="${removeAbandonedTimeout}" /> <!-- 关闭abanded连接时输出错误日志 --> <property name="logAbandoned" value="${logAbandoned}" /> </bean> <!-- 启用注解 --> <context:annotation-config /> <!-- 启动组件扫描,排除@Controller组件,该组件由SpringMVC配置文件扫描 --> <context:component-scan base-package="com"> <context:exclude-filter type="annotation" expression="org.springframework.stereotype.Controller" /> </context:component-scan> <!-- 注解方式配置事务--> <!-- <tx:annotation-driven transaction-manager="transactionManager" /> --> <!-- 拦截器方式配置事务 --> <tx:advice id="txAdvice" transaction-manager="transactionManager"> <tx:attributes> <tx:method name="delete*" propagation="REQUIRED" read-only="false" rollback-for="java.lang.Exception"/> <tx:method name="insert*" propagation="REQUIRED" read-only="false" rollback-for="java.lang.Exception" /> <tx:method name="update*" propagation="REQUIRED" read-only="false" rollback-for="java.lang.Exception" /> 
<tx:method name="save*" propagation="REQUIRED" read-only="false" rollback-for="java.lang.Exception" /> <tx:method name="*" propagation="SUPPORTS"/> </tx:attributes> </tx:advice> <aop:aspectj-autoproxy proxy-target-class="true"/> <!-- 事物处理 --> <aop:config> <aop:pointcut id="pc" expression="execution(* com.*.service..*(..))" /> <aop:advisor pointcut-ref="pc" advice-ref="txAdvice" /> </aop:config> <!-- 配置mybatis --> <bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean"> <property name="dataSource" ref="dataSource" /> <property name="configLocation" value="classpath:mybatis-config.xml"></property> <!-- mapper扫描 --> <property name="mapperLocations" value="classpath:mybatis/*/*.xml"></property> </bean> <bean id="sqlSessionTemplate" class="org.mybatis.spring.SqlSessionTemplate"> <constructor-arg ref="sqlSessionFactory" /> </bean> </beans> ```
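The top stack frame, `BaseTypeHandler.setParameter(BaseTypeHandler.java:43)`, is where this MyBatis version dereferences the declared `jdbcType` to call `ps.setNull()`; when the bound value is null and the mapping declares no `jdbcType`, that dereference is the NPE. So a likely cause is that `super_id` is null or missing in the `pd` map passed to `listAll`. A minimal sketch of one mitigation, assuming `super_id` is an integer column (the `jdbcType` below is my addition for illustration, not from the original project):

```xml
<!-- Hypothetical change: declaring a jdbcType lets MyBatis call
     PreparedStatement.setNull() instead of failing when super_id
     is absent from the pd map. -->
<select id="listAll" parameterType="pd" resultType="pd">
    select
    <include refid="Field"></include>
    from
    <include refid="tableName"></include>
    where super_id = #{super_id, jdbcType=INTEGER}
    order by sort
</select>
```

Independently of the mapping, it is worth checking in `CategoryController.list` (line 80 in the trace) that `super_id` is actually put into the `PageData` object before the query runs; declaring the `jdbcType` only makes the null legal, it does not make the query return the intended rows.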
Coreseek returns no results for Chinese queries; searching English words works fine
[root@abc testpack]# /usr/local/coreseek/bin/indexer -c etc/sphinx.conf --all
Coreseek Fulltext 4.1 [ Sphinx 2.0.2-dev (r2922)]
Copyright (c) 2007-2011, Beijing Choice Software Technologies Inc (http://www.coreseek.com)

using config file 'etc/sphinx.conf'...
indexing index 'test1'...
WARNING: Attribute count is 0: switching to none docinfo
collected 5 docs, 0.0 MB
sorted 0.0 Mhits, 100.0% done
total 5 docs, 186 bytes
total 0.064 sec, 2870 bytes/sec, 77.16 docs/sec
total 2 reads, 0.000 sec, 0.0 kb/call avg, 0.0 msec/call avg
total 6 writes, 0.000 sec, 0.0 kb/call avg, 0.0 msec/call avg

Searching for a Chinese term returns no results:

[root@abc testpack]# /usr/local/coreseek/bin/search -c etc/sphinx.conf '水火不容'
Coreseek Fulltext 4.1 [ Sphinx 2.0.2-dev (r2922)]
Copyright (c) 2007-2011, Beijing Choice Software Technologies Inc (http://www.coreseek.com)

using config file 'etc/sphinx.conf'...
index 'test1': query '水火不容 ': returned 0 matches of 0 total in 0.000 sec

words:
1. '水火': 0 documents, 0 hits
2. '不容': 0 documents, 0 hits

Searching for an English word does return results:

[root@abc testpack]# /usr/local/coreseek/bin/search -c etc/sphinx.conf 'apple'
Coreseek Fulltext 4.1 [ Sphinx 2.0.2-dev (r2922)]
Copyright (c) 2007-2011, Beijing Choice Software Technologies Inc (http://www.coreseek.com)

using config file 'etc/sphinx.conf'...
index 'test1': query 'apple ': returned 1 matches of 1 total in 0.001 sec

displaying matches:
1. document=5, weight=2780 id=5 title=apple content=apple,banana

words:
1. 'apple': 1 documents, 2 hits

This is the table in the database:

mysql> select * from tt;
+----+--------------+-----------------+
| id | title        | content         |
+----+--------------+-----------------+
|  1 | 西水         | 水水            |
|  2 | 水火不容     | 水火不容        |
|  3 | 水啊啊       | 啊水货          |
|  4 | 东南西水     | 啊西西哈哈      |
|  5 | apple        | apple,banana    |
+----+--------------+-----------------+
5 rows in set (0.00 sec)

Below is the config file:

# # Sphinx configuration file sample # # WARNING! While this sample file mentions all available options, # it contains (very) short helper descriptions only. Please refer to # doc/sphinx.html for details. 
# ############################################################################# ## data source definition ############################################################################# source src1 { # data source type. mandatory, no default value # known types are mysql, pgsql, mssql, xmlpipe, xmlpipe2, odbc type = mysql ##################################################################### ## SQL settings (for 'mysql' and 'pgsql' types) ##################################################################### # some straightforward parameters for SQL source types sql_host = localhost sql_user = root sql_pass = 123456 sql_db = haha sql_port = 3306 # optional, default is 3306 # UNIX socket name # optional, default is empty (reuse client library defaults) # usually '/var/lib/mysql/mysql.sock' on Linux # usually '/tmp/mysql.sock' on FreeBSD # sql_sock = /var/lib/mysql/mysql.sock # MySQL specific client connection flags # optional, default is 0 # # mysql_connect_flags = 32 # enable compression # MySQL specific SSL certificate settings # optional, defaults are empty # # mysql_ssl_cert = /etc/ssl/client-cert.pem # mysql_ssl_key = /etc/ssl/client-key.pem # mysql_ssl_ca = /etc/ssl/cacert.pem # MS SQL specific Windows authentication mode flag # MUST be in sync with charset_type index-level setting # optional, default is 0 # # mssql_winauth = 1 # use currently logged on user credentials # MS SQL specific Unicode indexing flag # optional, default is 0 (request SBCS data) # # mssql_unicode = 1 # request Unicode data from server # ODBC specific DSN (data source name) # mandatory for odbc source type, no default value # # odbc_dsn = DBQ=C:\data;DefaultDir=C:\data;Driver={Microsoft Text Driver (*.txt; *.csv)}; # sql_query = SELECT id, data FROM documents.csv # ODBC and MS SQL specific, per-column buffer sizes # optional, default is auto-detect # # sql_column_buffers = content=12M, comments=1M # pre-query, executed before the main fetch query # multi-value, optional, default is empty 
list of queries # sql_query_pre = SET NAMES utf8 sql_query_pre = SET SESSION query_cache_type=OFF # main document fetch query # mandatory, integer document ID field MUST be the first selected column sql_query = \ SELECT id, title, content FROM tt # joined/payload field fetch query # joined fields let you avoid (slow) JOIN and GROUP_CONCAT # payload fields let you attach custom per-keyword values (eg. for ranking) # # syntax is FIELD-NAME 'from' ( 'query' | 'payload-query' ); QUERY # joined field QUERY should return 2 columns (docid, text) # payload field QUERY should return 3 columns (docid, keyword, weight) # # REQUIRES that query results are in ascending document ID order! # multi-value, optional, default is empty list of queries # # sql_joined_field = tags from query; SELECT docid, CONCAT('tag',tagid) FROM tags ORDER BY docid ASC # sql_joined_field = wtags from payload-query; SELECT docid, tag, tagweight FROM tags ORDER BY docid ASC # file based field declaration # # content of this field is treated as a file name # and the file gets loaded and indexed in place of a field # # max file size is limited by max_file_field_buffer indexer setting # file IO errors are non-fatal and get reported as warnings # # sql_file_field = content_file_path # sql_query_info = SELECT * FROM tt WHERE id=$id # range query setup, query that must return min and max ID values # optional, default is empty # # sql_query will need to reference $start and $end boundaries # if using ranged query: # # sql_query = \ # SELECT doc.id, doc.id AS group, doc.title, doc.data \ # FROM documents doc \ # WHERE id>=$start AND id<=$end # # sql_query_range = SELECT MIN(id),MAX(id) FROM documents # range query step # optional, default is 1024 # # sql_range_step = 1000 # unsigned integer attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # optional bit size can be specified, default is 32 # # sql_attr_uint = author_id # sql_attr_uint = forum_id:9 # 9 bits for 
forum_id #sql_attr_uint = group_id # boolean attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # equivalent to sql_attr_uint with 1-bit size # # sql_attr_bool = is_deleted # bigint attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # declares a signed (unlike uint!) 64-bit attribute # # sql_attr_bigint = my_bigint_id # UNIX timestamp attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # similar to integer, but can also be used in date functions # # sql_attr_timestamp = posted_ts # sql_attr_timestamp = last_edited_ts #sql_attr_timestamp = date_added # string ordinal attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # sorts strings (bytewise), and stores their indexes in the sorted list # sorting by this attr is equivalent to sorting by the original strings # # sql_attr_str2ordinal = author_name # floating point attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # values are stored in single precision, 32-bit IEEE 754 format # # sql_attr_float = lat_radians # sql_attr_float = long_radians # multi-valued attribute (MVA) attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # MVA values are variable length lists of unsigned 32-bit integers # # syntax is ATTR-TYPE ATTR-NAME 'from' SOURCE-TYPE [;QUERY] [;RANGE-QUERY] # ATTR-TYPE is 'uint' or 'timestamp' # SOURCE-TYPE is 'field', 'query', or 'ranged-query' # QUERY is SQL query used to fetch all ( docid, attrvalue ) pairs # RANGE-QUERY is SQL query used to fetch min and max ID values, similar to 'sql_query_range' # # sql_attr_multi = uint tag from query; SELECT docid, tagid FROM tags # sql_attr_multi = uint tag from ranged-query; \ # SELECT docid, tagid FROM tags WHERE id>=$start AND id<=$end; \ # SELECT MIN(docid), MAX(docid) FROM tags # string attribute declaration # 
multi-value (an arbitrary number of these is allowed), optional # lets you store and retrieve strings # # sql_attr_string = stitle # wordcount attribute declaration # multi-value (an arbitrary number of these is allowed), optional # lets you count the words at indexing time # # sql_attr_str2wordcount = stitle # combined field plus attribute declaration (from a single column) # stores column as an attribute, but also indexes it as a full-text field # # sql_field_string = author # sql_field_str2wordcount = title # post-query, executed on sql_query completion # optional, default is empty # # sql_query_post = # post-index-query, executed on successful indexing completion # optional, default is empty # $maxid expands to max document ID actually fetched from DB # # sql_query_post_index = REPLACE INTO counters ( id, val ) \ # VALUES ( 'max_indexed_id', $maxid ) # ranged query throttling, in milliseconds # optional, default is 0 which means no delay # enforces given delay before each query step sql_ranged_throttle = 0 # document info query, ONLY for CLI search (ie. 
testing and debugging) # optional, default is empty # must contain $id macro and must fetch the document by that id sql_query_info = SELECT * FROM tt WHERE id=$id # kill-list query, fetches the document IDs for kill-list # k-list will suppress matches from preceding indexes in the same query # optional, default is empty # # sql_query_killlist = SELECT id FROM documents WHERE edited>=@last_reindex # columns to unpack on indexer side when indexing # multi-value, optional, default is empty list # # unpack_zlib = zlib_column # unpack_mysqlcompress = compressed_column # unpack_mysqlcompress = compressed_column_2 # maximum unpacked length allowed in MySQL COMPRESS() unpacker # optional, default is 16M # # unpack_mysqlcompress_maxsize = 16M ##################################################################### ## xmlpipe2 settings ##################################################################### # type = xmlpipe # shell command to invoke xmlpipe stream producer # mandatory # # xmlpipe_command = cat /usr/local/coreseek/var/test.xml # xmlpipe2 field declaration # multi-value, optional, default is empty # # xmlpipe_field = subject # xmlpipe_field = content # xmlpipe2 attribute declaration # multi-value, optional, default is empty # all xmlpipe_attr_XXX options are fully similar to sql_attr_XXX # # xmlpipe_attr_timestamp = published # xmlpipe_attr_uint = author_id # perform UTF-8 validation, and filter out incorrect codes # avoids XML parser choking on non-UTF-8 documents # optional, default is 0 # # xmlpipe_fixup_utf8 = 1 } # inherited source example # # all the parameters are copied from the parent source, # and may then be overridden in this source definition source src1throttled : src1 { sql_ranged_throttle = 100 } ############################################################################# ## index definition ############################################################################# # local index example # # this is an index which is stored locally in the 
filesystem # # all indexing-time options (such as morphology and charsets) # are configured per local index index test1 { # index type # optional, default is 'plain' # known values are 'plain', 'distributed', and 'rt' (see samples below) # type = plain # document source(s) to index # multi-value, mandatory # document IDs must be globally unique across all sources source = src1 # index files path and file name, without extension # mandatory, path must be writable, extensions will be auto-appended #path = /usr/local/coreseek/var/data/test1 # document attribute values (docinfo) storage mode # optional, default is 'extern' # known values are 'none', 'extern' and 'inline' docinfo = extern # memory locking for cached data (.spa and .spi), to prevent swapping # optional, default is 0 (do not mlock) # requires searchd to be run from root mlock = 0 # a list of morphology preprocessors to apply # optional, default is empty # # builtin preprocessors are 'none', 'stem_en', 'stem_ru', 'stem_enru', # 'soundex', and 'metaphone'; additional preprocessors available from # libstemmer are 'libstemmer_XXX', where XXX is algorithm code # (see libstemmer_c/libstemmer/modules.txt) # # morphology = stem_en, stem_ru, soundex # morphology = libstemmer_german # morphology = libstemmer_sv morphology = none # minimum word length at which to enable stemming # optional, default is 1 (stem everything) # # min_stemming_len = 1 path = /root/rearch_dir # stopword files list (space separated) # optional, default is empty # contents are plain text, charset_table and stemming are both applied # # stopwords = /usr/local/coreseek/var/data/stopwords.txt # wordforms file, in "mapfrom > mapto" plain text format # optional, default is empty # # wordforms = /usr/local/coreseek/var/data/wordforms.txt # tokenizing exceptions file # optional, default is empty # # plain text, case sensitive, space insensitive in map-from part # one "Map Several Words => ToASingleOne" entry per line # # exceptions = 
/usr/local/coreseek/var/data/exceptions.txt # minimum indexed word length # default is 1 (index everything) min_word_len = 1 # charset encoding type # optional, default is 'sbcs' # known types are 'sbcs' (Single Byte CharSet) and 'utf-8' charset_type = zh_cn.utf-8 charset_dictpath = /usr/local/mmseg3/etc/ # charset definition and case folding rules "table" # optional, default value depends on charset_type # # defaults are configured to include English and Russian characters only # you need to change the table to include additional ones # this behavior MAY change in future versions # # 'sbcs' default value is # charset_table = 0..9, A..Z->a..z, _, a..z, U+A8->U+B8, U+B8, U+C0..U+DF->U+E0..U+FF, U+E0..U+FF # # 'utf-8' default value is #charset_table = 0..9, A..Z->a..z, _, a..z, U+410..U+42F->U+430..U+44F, U+430..U+44F # ignored characters list # optional, default value is empty # # ignore_chars = U+00AD # minimum word prefix length to index # optional, default is 0 (do not index prefixes) # # min_prefix_len = 0 # minimum word infix length to index # optional, default is 0 (do not index infixes) # # min_infix_len = 0 # list of fields to limit prefix/infix indexing to # optional, default value is empty (index all fields in prefix/infix mode) # # prefix_fields = filename # infix_fields = url, domain # enable star-syntax (wildcards) when searching prefix/infix indexes # search-time only, does not affect indexing, can be 0 or 1 # optional, default is 0 (do not use wildcard syntax) # # enable_star = 1 # expand keywords with exact forms and/or stars when searching fit indexes # search-time only, does not affect indexing, can be 0 or 1 # optional, default is 0 (do not expand keywords) # # expand_keywords = 1 # n-gram length to index, for CJK indexing # only supports 0 and 1 for now, other lengths to be implemented # optional, default is 0 (disable n-grams) # ngram_len = 0 # n-gram characters list, for CJK indexing # optional, default is empty # # ngram_chars = 
U+3000..U+2FA1F # phrase boundary characters list # optional, default is empty # # phrase_boundary = ., ?, !, U+2026 # horizontal ellipsis # phrase boundary word position increment # optional, default is 0 # # phrase_boundary_step = 100 # blended characters list # blended chars are indexed both as separators and valid characters # for instance, AT&T will results in 3 tokens ("at", "t", and "at&t") # optional, default is empty # # blend_chars = +, &, U+23 # blended token indexing mode # a comma separated list of blended token indexing variants # known variants are trim_none, trim_head, trim_tail, trim_both, skip_pure # optional, default is trim_none # # blend_mode = trim_tail, skip_pure # whether to strip HTML tags from incoming documents # known values are 0 (do not strip) and 1 (do strip) # optional, default is 0 html_strip = 0 # what HTML attributes to index if stripping HTML # optional, default is empty (do not index anything) # # html_index_attrs = img=alt,title; a=title; # what HTML elements contents to strip # optional, default is empty (do not strip element contents) # # html_remove_elements = style, script # whether to preopen index data files on startup # optional, default is 0 (do not preopen), searchd-only # # preopen = 1 # whether to keep dictionary (.spi) on disk, or cache it in RAM # optional, default is 0 (cache in RAM), searchd-only # # ondisk_dict = 1 # whether to enable in-place inversion (2x less disk, 90-95% speed) # optional, default is 0 (use separate temporary files), indexer-only # # inplace_enable = 1 # in-place fine-tuning options # optional, defaults are listed below # # inplace_hit_gap = 0 # preallocated hitlist gap size # inplace_docinfo_gap = 0 # preallocated docinfo gap size # inplace_reloc_factor = 0.1 # relocation buffer size within arena # inplace_write_factor = 0.1 # write buffer size within arena # whether to index original keywords along with stemmed versions # enables "=exactform" operator to work # optional, default is 0 # # 
index_exact_words = 1 # position increment on overshort (less that min_word_len) words # optional, allowed values are 0 and 1, default is 1 # # overshort_step = 1 # position increment on stopword # optional, allowed values are 0 and 1, default is 1 # # stopword_step = 1 # hitless words list # positions for these keywords will not be stored in the index # optional, allowed values are 'all', or a list file name # # hitless_words = all # hitless_words = hitless.txt # detect and index sentence and paragraph boundaries # required for the SENTENCE and PARAGRAPH operators to work # optional, allowed values are 0 and 1, default is 0 # # index_sp = 1 # index zones, delimited by HTML/XML tags # a comma separated list of tags and wildcards # required for the ZONE operator to work # optional, default is empty string (do not index zones) # # index_zones = title, h*, th } # inherited index example # # all the parameters are copied from the parent index, # and may then be overridden in this index definition #index test1stemmed : test1 #{ # path = /usr/local/coreseek/var/data/test1stemmed # morphology = stem_en #} # distributed index example # # this is a virtual index which can NOT be directly indexed, # and only contains references to other local and/or remote indexes #index dist1 #{ # 'distributed' index type MUST be specified # type = distributed # local index to be searched # there can be many local indexes configured # local = test1 # local = test1stemmed # remote agent # multiple remote agents may be specified # syntax for TCP connections is 'hostname:port:index1,[index2[,...]]' # syntax for local UNIX connections is '/path/to/socket:index1,[index2[,...]]' # agent = localhost:9313:remote1 # agent = localhost:9314:remote2,remote3 # agent = /var/run/searchd.sock:remote4 # blackhole remote agent, for debugging/testing # network errors and search results will be ignored # # agent_blackhole = testbox:9312:testindex1,testindex2 # remote agent connection timeout, milliseconds # 
optional, default is 1000 ms, ie. 1 sec # agent_connect_timeout = 1000 # remote agent query timeout, milliseconds # optional, default is 3000 ms, ie. 3 sec # agent_query_timeout = 3000 #} # realtime index example # # you can run INSERT, REPLACE, and DELETE on this index on the fly # using MySQL protocol (see 'listen' directive below) #index rt #{ # 'rt' index type must be specified to use RT index #type = rt # index files path and file name, without extension # mandatory, path must be writable, extensions will be auto-appended # path = /usr/local/coreseek/var/data/rt # RAM chunk size limit # RT index will keep at most this much data in RAM, then flush to disk # optional, default is 32M # # rt_mem_limit = 512M # full-text field declaration # multi-value, mandatory # rt_field = title # rt_field = content # unsigned integer attribute declaration # multi-value (an arbitrary number of attributes is allowed), optional # declares an unsigned 32-bit attribute # rt_attr_uint = gid # RT indexes currently support the following attribute types: # uint, bigint, float, timestamp, string # # rt_attr_bigint = guid # rt_attr_float = gpa # rt_attr_timestamp = ts_added # rt_attr_string = content #} ############################################################################# ## indexer settings ############################################################################# indexer { # memory limit, in bytes, kiloytes (16384K) or megabytes (256M) # optional, default is 32M, max is 2047M, recommended is 256M to 1024M mem_limit = 256M # maximum IO calls per second (for I/O throttling) # optional, default is 0 (unlimited) # # max_iops = 40 # maximum IO call size, bytes (for I/O throttling) # optional, default is 0 (unlimited) # # max_iosize = 1048576 # maximum xmlpipe2 field length, bytes # optional, default is 2M # # max_xmlpipe2_field = 4M # write buffer size, bytes # several (currently up to 4) buffers will be allocated # write buffers are allocated in addition to mem_limit # optional, 
default is 1M # # write_buffer = 1M # maximum file field adaptive buffer size # optional, default is 8M, minimum is 1M # # max_file_field_buffer = 32M } ############################################################################# ## searchd settings ############################################################################# searchd { # [hostname:]port[:protocol], or /unix/socket/path to listen on # known protocols are 'sphinx' (SphinxAPI) and 'mysql41' (SphinxQL) # # multi-value, multiple listen points are allowed # optional, defaults are 9312:sphinx and 9306:mysql41, as below # # listen = 127.0.0.1 # listen = 192.168.0.1:9312 # listen = 9312 # listen = /var/run/searchd.sock listen = 9312 #listen = 9306:mysql41 # log file, searchd run info is logged here # optional, default is 'searchd.log' log = /usr/local/coreseek/var/log/searchd.log # query log file, all search queries are logged here # optional, default is empty (do not log queries) query_log = /usr/local/coreseek/var/log/query.log # client read timeout, seconds # optional, default is 5 read_timeout = 5 # request timeout, seconds # optional, default is 5 minutes client_timeout = 300 # maximum amount of children to fork (concurrent searches to run) # optional, default is 0 (unlimited) max_children = 30 # PID file, searchd process ID file name # mandatory pid_file = /usr/local/coreseek/var/log/searchd.pid # max amount of matches the daemon ever keeps in RAM, per-index # WARNING, THERE'S ALSO PER-QUERY LIMIT, SEE SetLimits() API CALL # default is 1000 (just like Google) max_matches = 1000 # seamless rotate, prevents rotate stalls if precaching huge datasets # optional, default is 1 seamless_rotate = 1 # whether to forcibly preopen all indexes on startup # optional, default is 1 (preopen everything) preopen_indexes = 0 # whether to unlink .old index copies on succesful rotation. 
# optional, default is 1 (do unlink) unlink_old = 1 # attribute updates periodic flush timeout, seconds # updates will be automatically dumped to disk this frequently # optional, default is 0 (disable periodic flush) # # attr_flush_period = 900 # instance-wide ondisk_dict defaults (per-index value take precedence) # optional, default is 0 (precache all dictionaries in RAM) # # ondisk_dict_default = 1 # MVA updates pool size # shared between all instances of searchd, disables attr flushes! # optional, default size is 1M mva_updates_pool = 1M # max allowed network packet size # limits both query packets from clients, and responses from agents # optional, default size is 8M max_packet_size = 8M # crash log path # searchd will (try to) log crashed query to 'crash_log_path.PID' file # optional, default is empty (do not create crash logs) # # crash_log_path = /usr/local/coreseek/var/log/crash # max allowed per-query filter count # optional, default is 256 max_filters = 256 # max allowed per-filter values count # optional, default is 4096 max_filter_values = 4096 # socket listen queue length # optional, default is 5 # # listen_backlog = 5 # per-keyword read buffer size # optional, default is 256K # # read_buffer = 256K # unhinted read size (currently used when reading hits) # optional, default is 32K # # read_unhinted = 32K # max allowed per-batch query count (aka multi-query count) # optional, default is 32 max_batch_queries = 32 # max common subtree document cache size, per-query # optional, default is 0 (disable subtree optimization) # # subtree_docs_cache = 4M # max common subtree hit cache size, per-query # optional, default is 0 (disable subtree optimization) # # subtree_hits_cache = 8M # multi-processing mode (MPM) # known values are none, fork, prefork, and threads # optional, default is fork # workers = threads # for RT to work # max threads to create for searching local parts of a distributed index # optional, default is 0, which means disable multi-threaded 
searching # should work with all MPMs (ie. does NOT require workers=threads) # # dist_threads = 4 # binlog files path; use empty string to disable binlog # optional, default is build-time configured data directory # # binlog_path = # disable logging # binlog_path = /usr/local/coreseek/var/data # binlog.001 etc will be created there # binlog flush/sync mode # 0 means flush and sync every second # 1 means flush and sync every transaction # 2 means flush every transaction, sync every second # optional, default is 2 # # binlog_flush = 2 # binlog per-file size limit # optional, default is 128M, 0 means no limit # # binlog_max_log_size = 256M # per-thread stack size, only affects workers=threads mode # optional, default is 64K # # thread_stack = 128K # per-keyword expansion limit (for dict=keywords prefix searches) # optional, default is 0 (no limit) # # expansion_limit = 1000 # RT RAM chunks flush period # optional, default is 0 (no periodic flush) # # rt_flush_period = 900 # query log file format # optional, known values are plain and sphinxql, default is plain # # query_log_format = sphinxql # version string returned to MySQL network protocol clients # optional, default is empty (use Sphinx version) # # mysql_version_string = 5.0.37 # trusted plugin directory # optional, default is empty (disable UDFs) # # plugin_dir = /usr/local/sphinx/lib # default server-wide collation # optional, default is libc_ci # # collation_server = utf8_general_ci # server-wide locale for libc based collations # optional, default is C # # collation_libc_locale = ru_RU.UTF-8 # threaded server watchdog (only used in workers=threads mode) # optional, values are 0 and 1, default is 1 (watchdog on) # # watchdog = 1 # SphinxQL compatibility mode (legacy columns and their names) # optional, default is 0 (SQL compliant syntax and result sets) # # compat_sphinxql_magics = 1 } # --eof-- Please help — I can't figure out what is wrong with this config; searches for Chinese text return no results.
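On the Chinese-search question above: in the pasted config `ngram_len = 0` and `ngram_chars` is commented out, so unless the mmseg segmenter is actually working, CJK text is never tokenized at all and Chinese queries match nothing. A sketch of the two usual ways to make Chinese searchable (assumptions: this is coreseek, and the mmseg dictionary files, e.g. `uni.lib`, really exist under `/usr/local/mmseg3/etc/` — verify that path):

```
# option 1 (coreseek): mmseg word segmentation -- dictpath must contain uni.lib
charset_type     = zh_cn.utf-8
charset_dictpath = /usr/local/mmseg3/etc/

# option 2 (plain Sphinx): index every CJK character as a unigram
# ngram_len   = 1
# ngram_chars = U+3000..U+2FA1F
```

After any of these changes the index has to be rebuilt (`indexer --all --rotate`) before searchd can return Chinese results.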
Why does the KSVD program require the signal matrix Y to have more columns than the dictionary D?
When training a dictionary with KSVD.m, one passage in the program requires the number of columns of the signal matrix Y to be no smaller than the number of columns of the dictionary D, otherwise it reports an error. In theory this requirement should not be needed — could someone explain why it is there? (line 91 of the program) If Y is an n*1 vector, is it really impossible to train a dictionary? ``` function [Dictionary,output] = KSVD(... Data,... % an nXN matrix that contains N signals (Y), each of dimension n. param) % ========================================================================= % K-SVD algorithm % ========================================================================= % The K-SVD algorithm finds a dictionary for linear representation of % signals. Given a set of signals, it searches for the best dictionary that % can sparsely represent each signal. Detailed discussion on the algorithm % and possible applications can be found in "The K-SVD: An Algorithm for % Designing of Overcomplete Dictionaries for Sparse Representation", written % by M. Aharon, M. Elad, and A.M. Bruckstein and appeared in the IEEE Trans. % On Signal Processing, Vol. 54, no. 11, pp. 4311-4322, November 2006. % ========================================================================= % INPUT ARGUMENTS: % Data an nXN matrix that contains N signals (Y), each of dimension n. % param structure that includes all required % parameters for the K-SVD execution. % Required fields are: % K, ... the number of dictionary elements to train % K: the number of atoms % numIteration,... number of iterations to perform. % numIteration: the number of iterations % errorFlag... if =0, a fixed number of coefficients is % used for representation of each signal. If so, param.L must be % specified as the number of representing atoms. if =1, arbitrary number % of atoms represent each signal, until a specific representation error % is reached. If so, param.errorGoal must be specified as the allowed % error. % preserveDCAtom... if =1 then the first atom in the dictionary % is set to be constant, and does not ever change. This % might be useful for working with natural % images (in this case, only param.K-1 % atoms are trained). % (optional, see errorFlag) L,... % maximum coefficients to use in OMP coefficient calculations. 
% (optional, see errorFlag) errorGoal, ... % allowed representation error in representing each signal. % InitializationMethod,... mehtod to initialize the dictionary, can % be one of the following arguments: % * 'DataElements' (initialization by the signals themselves), or: % * 'GivenMatrix' (initialization by a given matrix param.initialDictionary). % (optional, see InitializationMethod) initialDictionary,... % if the initialization method % is 'GivenMatrix', this is the matrix that will be used. % (optional) TrueDictionary, ... % if specified, in each % iteration the difference between this dictionary and the trained one % is measured and displayed. % displayProgress, ... if =1 progress information is displyed. If param.errorFlag==0, % the average repersentation error (RMSE) is displayed, while if % param.errorFlag==1, the average number of required coefficients for % representation of each signal is displayed. % ========================================================================= % OUTPUT ARGUMENTS: % Dictionary The extracted dictionary of size nX(param.K). % output Struct that contains information about the current run. It may include the following fields: % CoefMatrix The final coefficients matrix (it should hold that Data equals approximately Dictionary*output.CoefMatrix. % ratio If the true dictionary was defined (in % synthetic experiments), this parameter holds a vector of length % param.numIteration that includes the detection ratios in each % iteration). 
% totalerr The total representation error after each % iteration (defined only if % param.displayProgress=1 and % param.errorFlag = 0) % numCoef A vector of length param.numIteration that % include the average number of coefficients required for representation % of each signal (in each iteration) (defined only if % param.displayProgress=1 and % param.errorFlag = 1) % ========================================================================= if (~isfield(param,'displayProgress')) param.displayProgress = 0; end totalerr(1) = 99999; if (isfield(param,'errorFlag')==0) param.errorFlag = 0; end if (isfield(param,'TrueDictionary')) displayErrorWithTrueDictionary = 1; ErrorBetweenDictionaries = zeros(param.numIteration+1,1); ratio = zeros(param.numIteration+1,1); else displayErrorWithTrueDictionary = 0; ratio = 0; end if (param.preserveDCAtom>0) FixedDictionaryElement(1:size(Data,1),1) = 1/sqrt(size(Data,1)); else FixedDictionaryElement = []; end % coefficient calculation method is OMP with fixed number of coefficients if (size(Data,2) < param.K)% the problem is here: if Data has fewer columns than K, this branch stops the run disp('Size of data is smaller than the dictionary size. Trivial solution...'); Dictionary = Data(:,1:size(Data,2)); return; elseif (strcmp(param.InitializationMethod,'DataElements')) Dictionary(:,1:param.K-param.preserveDCAtom) = Data(:,1:param.K-param.preserveDCAtom); elseif (strcmp(param.InitializationMethod,'GivenMatrix')) Dictionary(:,1:param.K-param.preserveDCAtom) = param.initialDictionary(:,1:param.K-param.preserveDCAtom); end % reduce the components in Dictionary that are spanned by the fixed % elements if (param.preserveDCAtom) tmpMat = FixedDictionaryElement \ Dictionary; Dictionary = Dictionary - FixedDictionaryElement*tmpMat; end %normalize the dictionary (each column to unit norm) Dictionary = Dictionary*diag(1./sqrt(sum(Dictionary.*Dictionary))); Dictionary = Dictionary.*repmat(sign(Dictionary(1,:)),size(Dictionary,1),1); totalErr = zeros(1,param.numIteration); %% % the K-SVD algorithm starts here. 
for iterNum = 1:param.numIteration %param.numIteration = numIterOfKsvd=10 % find the coefficients if (param.errorFlag==0) %param.errorFlag = 1; %CoefMatrix = mexOMPIterative2(Data, [FixedDictionaryElement,Dictionary],param.L); CoefMatrix = OMP([FixedDictionaryElement,Dictionary],Data, param.L); %size(Data,2)=249*249 else %CoefMatrix = mexOMPerrIterative(Data, [FixedDictionaryElement,Dictionary],param.errorGoal); CoefMatrix = OMPerr([FixedDictionaryElement,Dictionary],Data, param.errorGoal);%%%%%%%%%%param.errorGoal = sigma*C; 稀疏矩阵 param.L = 1; end replacedVectorCounter = 0; rPerm = randperm(size(Dictionary,2)); for j = rPerm [betterDictionaryElement,CoefMatrix,addedNewVector] = I_findBetterDictionaryElement(Data,... [FixedDictionaryElement,Dictionary],j+size(FixedDictionaryElement,2),... CoefMatrix,param.L); Dictionary(:,j) = betterDictionaryElement; if (param.preserveDCAtom) tmpCoef = FixedDictionaryElement\betterDictionaryElement; Dictionary(:,j) = betterDictionaryElement - FixedDictionaryElement*tmpCoef; Dictionary(:,j) = Dictionary(:,j)./sqrt(Dictionary(:,j)'*Dictionary(:,j)); end replacedVectorCounter = replacedVectorCounter+addedNewVector; end if (iterNum>1 & param.displayProgress) if (param.errorFlag==0) output.totalerr(iterNum-1) = sqrt(sum(sum((Data-[FixedDictionaryElement,Dictionary]*CoefMatrix).^2))/prod(size(Data))); disp(['Iteration ',num2str(iterNum),' Total error is: ',num2str(output.totalerr(iterNum-1))]); else %执行此句 output.numCoef(iterNum-1) = length(find(CoefMatrix))/size(Data,2); disp(['Iteration ',num2str(iterNum),' Average number of coefficients: ',num2str(output.numCoef(iterNum-1))]); end end if (displayErrorWithTrueDictionary ) [ratio(iterNum+1),ErrorBetweenDictionaries(iterNum+1)] = I_findDistanseBetweenDictionaries(param.TrueDictionary,Dictionary);%%%%%% disp(strcat(['Iteration ', num2str(iterNum),' ratio of restored elements: ',num2str(ratio(iterNum+1))])); output.ratio = ratio; end Dictionary = 
I_clearDictionary(Dictionary,CoefMatrix(size(FixedDictionaryElement,2)+1:end,:),Data); if (isfield(param,'waitBarHandle')) waitbar(iterNum/param.counterForWaitBar); end end output.CoefMatrix = CoefMatrix; Dictionary = [FixedDictionaryElement,Dictionary]; function [betterDictionaryElement,CoefMatrix,NewVectorAdded] = I_findBetterDictionaryElement(Data,Dictionary,j,CoefMatrix,numCoefUsed) if (length(who('numCoefUsed'))==0) numCoefUsed = 1; % liu=1%%%%没有进行此句,说明if条件不满足。 end relevantDataIndices = find(CoefMatrix(j,:)); % the data indices that uses the j'th dictionary element. 查找出系数矩阵中每一行中非0元素的序号 参考DCT字典的程序:relevantDataIndices = find(Coefs(3,:)); if (length(relevantDataIndices)<1) %(length(relevantDataIndices)==0) 如果系数矩阵为空,则进行如下的语句 。 如果relevantDataIndices为0,说明没有patch表达粗腰用到第j个原子 ErrorMat = Data-Dictionary*CoefMatrix; ErrorNormVec = sum(ErrorMat.^2); [d,i] = max(ErrorNormVec); betterDictionaryElement = Data(:,i);%ErrorMat(:,i); % betterDictionaryElement = betterDictionaryElement./sqrt(betterDictionaryElement'*betterDictionaryElement);%归一化 betterDictionaryElement = betterDictionaryElement.*sign(betterDictionaryElement(1)); CoefMatrix(j,:) = 0; NewVectorAdded = 1%%%%%实验证明(针对w.jpg图像),值累加了一次 % liuzhe=1 没进行此句,说明稀疏矩阵的每一行都有非零的元素 return; end NewVectorAdded = 0; tmpCoefMatrix = CoefMatrix(:,relevantDataIndices); %将稀疏矩阵中非0 的取出来 tmpCoefMatrix尺寸为:256*length(relevantDataIndices) tmpCoefMatrix(j,:) = 0;% the coeffitients of the element we now improve are not relevant. errors =(Data(:,relevantDataIndices) - Dictionary*tmpCoefMatrix); % vector of errors that we want to minimize with the new element D:64*256 tmpCoefMatrix尺寸为:256*length(relevantDataIndices) Data(:,relevantDataIndices):64*relevantDataIndices % % the better dictionary element and the values of beta are found using svd. % % This is because we would like to minimize || errors - beta*element ||_F^2. % % that is, to approximate the matrix 'errors' with a one-rank matrix. This % % is done using the largest singular value. 
%%在这里使用SVD就可以达到|| errors - beta*element ||_F^2误差最小的效果 [betterDictionaryElement,singularValue,betaVector] = svds(errors,1);%%%%%%%仅仅取出了第一主分量 errors的大小为;64*relevantDataIndices M=64 N=relevantDataIndices betterDictionaryElement*singularValue*betaVector'近似的可以表示errors %a=[1 2 3 4;5 6 7 8;9 10 11 12;2 4 6 7.99999]; [u,s,v]=svds(a) u*s*v' [u,s,v]=svds(a,1):取出的第一主成分 %对于svds函数:a为M*N的矩阵,那么u:M*M S:M*N(简写成M*M) V=N*M V'=M*N %对于svd函数:a为M*N的矩阵, 那么u:M*M S:M*N V=N*N V'=N*N %将字典原子D的解定义为U中的第一列,将系数向量CoefMatrix的解定义为V的第一列与S(1,1)的乘积 这个是核心 核心 核心!!!!!!!!!!!!!!! CoefMatrix(j,relevantDataIndices) = singularValue*betaVector';% *signOfFirstElem s*v' [u,s,v]=svds(a,1):取出的第一主成分 ,所以此时s*v'矩阵大小为 1*N,即CoefMatrix(j,relevantDataIndices)也为:1*N betterDictionaryElement:M*1,即64*1的向量 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% % findDistanseBetweenDictionaries %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% function [ratio,totalDistances] = I_findDistanseBetweenDictionaries(original,new) % first, all the column in oiginal starts with positive values. 
catchCounter = 0; totalDistances = 0; for i = 1:size(new,2) new(:,i) = sign(new(1,i))*new(:,i); end for i = 1:size(original,2) d = sign(original(1,i))*original(:,i); distances =sum ( (new-repmat(d,1,size(new,2))).^2); [minValue,index] = min(distances); errorOfElement = 1-abs(new(:,index)'*d); totalDistances = totalDistances+errorOfElement; catchCounter = catchCounter+(errorOfElement<0.01); end ratio = 100*catchCounter/size(original,2); %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% % I_clearDictionary %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% function Dictionary = I_clearDictionary(Dictionary,CoefMatrix,Data) T2 = 0.99; T1 = 3; K=size(Dictionary,2); %%K=256 Er=sum((Data-Dictionary*CoefMatrix).^2,1); % remove identical atoms(删除相同的原子) 列求和 CoefMatrix(j,relevantDataIndices)的大小为256*relevantDataIndices G=Dictionary'*Dictionary; %256*256 G表示不同的原子求内积,可以认为是计算相似性 G 的大小是 K*K G = G-diag(diag(G));%例如:G=magic(3) diag(diag(G)) 也就是将对角的元素赋值为0 for jj=1:1:K, if max(G(jj,:))>T2 | length(find(abs(CoefMatrix(jj,:))>1e-7))<=T1 , %G(jj,:))>T2 表示两个原子间相似性很高,大于0.99 %length(find(abs(CoefMatrix(jj,:))>1e-7) 表示这使用到第jj个原子的patch少于3个 [val,pos]=max(Er); clearDictionary=1%%%%%%%%%%%%%%%%%%%%%%%%测试满足if条件的有多少次 Er(pos(1))=0;%将最大误差处的值赋值为0 Dictionary(:,jj)=Data(:,pos(1))/norm(Data(:,pos(1)));%%norm(Data(:,pos(1)):求向量的模 此整句相当于把误差最大的列归一化 G=Dictionary'*Dictionary; G = G-diag(diag(G)); end; end; ```
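The guard the question points at (line 91, `size(Data,2) < param.K`) comes from the implementation, not from the theory: with `InitializationMethod = 'DataElements'` the initial dictionary is literally the first K training signals, D0 = Data(:,1:K), so with fewer than K signals there are not enough columns to fill D0 — and each of the N < K signals can then be represented exactly by itself, which is why KSVD.m prints "Trivial solution..." and returns the data. A minimal numpy sketch of that initialization logic (function and variable names here are illustrative, not from KSVD.m):

```python
import numpy as np

def init_dictionary(Y, K):
    """Mimic KSVD.m's guard and 'DataElements' initialization (a sketch)."""
    n, N = Y.shape
    if N < K:
        # the branch the question asks about: trivial solution, D = Y,
        # since every signal is exactly its own atom
        return Y.copy()
    D = Y[:, :K].astype(float)
    # normalize the columns to unit norm, as KSVD.m does before iterating
    return D / np.linalg.norm(D, axis=0, keepdims=True)

n, K = 8, 5
Y_big = np.random.randn(n, 20)           # N = 20 >= K: a real K-atom dictionary
Y_small = np.random.randn(n, 3)          # N = 3 < K: close to the n*1 case
D = init_dictionary(Y_big, K)            # shape (8, 5), unit-norm columns
D_trivial = init_dictionary(Y_small, K)  # shape (8, 3): just Y itself
```

So an n*1 vector Y cannot drive the training loop at all under this initialization; you would need either N >= K signals or a `GivenMatrix` initial dictionary.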
Queries always run against the master database automatically — can they default to the database I created instead?
1. Every query automatically runs against the master database. Can it default to the database I created instead? Do I really have to switch to my database every time? Not very smart. If the script has already run against master, can that be undone so master is restored? 2. I keep getting syntax errors even though the SQL looks fine — is the master database broken? ``` USE [fhadmin] GO /****** Object: Database [fhadmin] Script Date: 2016/2/4 11:00:37 ******/ CREATE DATABASE [fhadmin] CONTAINMENT = NONE ON PRIMARY ( NAME = N'fhadmin', FILENAME = N'D:\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\fhadmin.mdf' , SIZE = 5120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB ) LOG ON ( NAME = N'fhadmin_log', FILENAME = N'D:\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\fhadmin_log.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%) GO ALTER DATABASE [fhadmin] SET COMPATIBILITY_LEVEL = 110 GO IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled')) begin EXEC [fhadmin].[dbo].[sp_fulltext_database] @action = 'enable' end GO ALTER DATABASE [fhadmin] SET ANSI_NULL_DEFAULT OFF GO ALTER DATABASE [fhadmin] SET ANSI_NULLS OFF GO ALTER DATABASE [fhadmin] SET ANSI_PADDING OFF GO ALTER DATABASE [fhadmin] SET ANSI_WARNINGS OFF GO ALTER DATABASE [fhadmin] SET ARITHABORT OFF GO ALTER DATABASE [fhadmin] SET AUTO_CLOSE OFF GO ALTER DATABASE [fhadmin] SET AUTO_CREATE_STATISTICS ON GO ALTER DATABASE [fhadmin] SET AUTO_SHRINK OFF GO ALTER DATABASE [fhadmin] SET AUTO_UPDATE_STATISTICS ON GO ALTER DATABASE [fhadmin] SET CURSOR_CLOSE_ON_COMMIT OFF GO ALTER DATABASE [fhadmin] SET CURSOR_DEFAULT GLOBAL GO ALTER DATABASE [fhadmin] SET CONCAT_NULL_YIELDS_NULL OFF GO ALTER DATABASE [fhadmin] SET NUMERIC_ROUNDABORT OFF GO ALTER DATABASE [fhadmin] SET QUOTED_IDENTIFIER OFF GO ALTER DATABASE [fhadmin] SET RECURSIVE_TRIGGERS OFF GO ALTER DATABASE [fhadmin] SET DISABLE_BROKER GO ALTER DATABASE [fhadmin] SET AUTO_UPDATE_STATISTICS_ASYNC OFF GO ALTER DATABASE [fhadmin] SET DATE_CORRELATION_OPTIMIZATION OFF GO ALTER DATABASE [fhadmin] SET TRUSTWORTHY OFF GO ALTER DATABASE [fhadmin] SET ALLOW_SNAPSHOT_ISOLATION OFF GO ALTER DATABASE [fhadmin] SET PARAMETERIZATION SIMPLE GO ALTER DATABASE [fhadmin] SET READ_COMMITTED_SNAPSHOT OFF GO 
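-- On question 1 above: a new query window opens in the login's default
-- database, which is master unless you change it. Two usual fixes, sketched
-- here as comments (the login name [sa] below is only an example placeholder;
-- substitute the login you actually connect with):
--   a) keep the USE [fhadmin] / GO lines at the top of every script, or
--   b) change the login's default database once:
--      ALTER LOGIN [sa] WITH DEFAULT_DATABASE = [fhadmin];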
ALTER DATABASE [fhadmin] SET HONOR_BROKER_PRIORITY OFF GO ALTER DATABASE [fhadmin] SET RECOVERY FULL GO ALTER DATABASE [fhadmin] SET MULTI_USER GO ALTER DATABASE [fhadmin] SET PAGE_VERIFY CHECKSUM GO ALTER DATABASE [fhadmin] SET DB_CHAINING OFF GO ALTER DATABASE [fhadmin] SET FILESTREAM( NON_TRANSACTED_ACCESS = OFF ) GO ALTER DATABASE [fhadmin] SET TARGET_RECOVERY_TIME = 0 SECONDS GO EXEC sys.sp_db_vardecimal_storage_format N'fhadmin', N'ON' GO USE [fhadmin] GO /****** Object: Table [dbo].[FH_TESTFH] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[FH_TESTFH]( [TESTFH_ID] [nvarchar](100) NOT NULL, [NAME] [nvarchar](255) NULL, [AGE] [int] NOT NULL, [BIRTHDAY] [nvarchar](32) NULL, PRIMARY KEY CLUSTERED ( [TESTFH_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_app_user] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_app_user]( [USER_ID] [nvarchar](100) NOT NULL, [USERNAME] [nvarchar](255) NULL, [PASSWORD] [nvarchar](255) NULL, [NAME] [nvarchar](255) NULL, [RIGHTS] [nvarchar](255) NULL, [ROLE_ID] [nvarchar](100) NULL, [LAST_LOGIN] [nvarchar](255) NULL, [IP] [nvarchar](100) NULL, [STATUS] [nvarchar](32) NULL, [BZ] [nvarchar](255) NULL, [PHONE] [nvarchar](100) NULL, [SFID] [nvarchar](100) NULL, [START_TIME] [nvarchar](100) NULL, [END_TIME] [nvarchar](100) NULL, [YEARS] [int] NULL, [NUMBER] [nvarchar](100) NULL, [EMAIL] [nvarchar](32) NULL, PRIMARY KEY CLUSTERED ( [USER_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_createcode] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_createcode]( 
[CREATECODE_ID] [nvarchar](100) NOT NULL, [PACKAGENAME] [nvarchar](50) NULL, [OBJECTNAME] [nvarchar](50) NULL, [TABLENAME] [nvarchar](50) NULL, [FIELDLIST] [nvarchar](4000) NULL, [CREATETIME] [nvarchar](100) NULL, [TITLE] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [CREATECODE_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_department] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_department]( [DEPARTMENT_ID] [nvarchar](100) NOT NULL, [NAME] [nvarchar](30) NULL, [NAME_EN] [nvarchar](50) NULL, [BIANMA] [nvarchar](50) NULL, [PARENT_ID] [nvarchar](100) NULL, [BZ] [nvarchar](255) NULL, [HEADMAN] [nvarchar](30) NULL, [TEL] [nvarchar](50) NULL, [FUNCTIONS] [nvarchar](255) NULL, [ADDRESS] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [DEPARTMENT_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_dictionaries] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_dictionaries]( [DICTIONARIES_ID] [nvarchar](100) NOT NULL, [NAME] [nvarchar](30) NULL, [NAME_EN] [nvarchar](50) NULL, [BIANMA] [nvarchar](50) NULL, [ORDER_BY] [int] NOT NULL, [PARENT_ID] [nvarchar](100) NULL, [BZ] [nvarchar](255) NULL, [TBSNAME] [nvarchar](100) NULL, PRIMARY KEY CLUSTERED ( [DICTIONARIES_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_fhbutton] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_fhbutton]( [FHBUTTON_ID] [nvarchar](100) NOT NULL, [NAME] [nvarchar](30) NULL, [QX_NAME] 
[nvarchar](50) NULL, [BZ] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [FHBUTTON_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_fhsms] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_fhsms]( [FHSMS_ID] [nvarchar](100) NOT NULL, [CONTENT] [nvarchar](1000) NULL, [TYPE] [nvarchar](5) NULL, [TO_USERNAME] [nvarchar](255) NULL, [FROM_USERNAME] [nvarchar](255) NULL, [SEND_TIME] [nvarchar](100) NULL, [STATUS] [nvarchar](5) NULL, [SANME_ID] [nvarchar](100) NULL, PRIMARY KEY CLUSTERED ( [FHSMS_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_menu] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_menu]( [MENU_ID] [int] NOT NULL, [MENU_NAME] [nvarchar](255) NULL, [MENU_URL] [nvarchar](255) NULL, [PARENT_ID] [nvarchar](100) NULL, [MENU_ORDER] [nvarchar](100) NULL, [MENU_ICON] [nvarchar](60) NULL, [MENU_TYPE] [nvarchar](10) NULL, [MENU_STATE] [int] NULL, PRIMARY KEY CLUSTERED ( [MENU_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_role] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_role]( [ROLE_ID] [nvarchar](100) NOT NULL, [ROLE_NAME] [nvarchar](100) NULL, [RIGHTS] [nvarchar](255) NULL, [PARENT_ID] [nvarchar](100) NULL, [ADD_QX] [nvarchar](255) NULL, [DEL_QX] [nvarchar](255) NULL, [EDIT_QX] [nvarchar](255) NULL, [CHA_QX] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [ROLE_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, 
ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_role_fhbutton] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[sys_role_fhbutton]( [RB_ID] [nvarchar](100) NOT NULL, [ROLE_ID] [nvarchar](100) NULL, [BUTTON_ID] [nvarchar](100) NULL, PRIMARY KEY CLUSTERED ( [RB_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[sys_user] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO SET ANSI_PADDING ON GO CREATE TABLE [dbo].[sys_user]( [USER_ID] [char](32) NOT NULL, [USERNAME] [nvarchar](100) NULL, [PASSWORD] [nvarchar](100) NULL, [NAME] [nvarchar](100) NULL, [RIGHTS] [nvarchar](255) NULL, [ROLE_ID] [nvarchar](100) NULL, [LAST_LOGIN] [nvarchar](100) NULL, [IP] [nvarchar](15) NULL, [STATUS] [nvarchar](32) NULL, [BZ] [nvarchar](255) NULL, [SKIN] [nvarchar](100) NULL, [EMAIL] [nvarchar](50) NULL, [NUMBER] [nvarchar](100) NULL, [PHONE] [nvarchar](100) NULL, CONSTRAINT [PK_sys_user] PRIMARY KEY CLUSTERED ( [USER_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO SET ANSI_PADDING OFF GO /****** Object: Table [dbo].[tb_pictures] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[tb_pictures]( [PICTURES_ID] [nvarchar](100) NOT NULL, [TITLE] [nvarchar](255) NULL, [NAME] [nvarchar](255) NULL, [PATH] [nvarchar](255) NULL, [CREATETIME] [nvarchar](100) NULL, [MASTER_ID] [nvarchar](255) NULL, [BZ] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [PICTURES_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table 
[dbo].[weixin_command] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[weixin_command]( [COMMAND_ID] [nvarchar](100) NOT NULL, [KEYWORD] [nvarchar](255) NULL, [COMMANDCODE] [nvarchar](255) NULL, [CREATETIME] [nvarchar](255) NULL, [STATUS] [int] NOT NULL, [BZ] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [COMMAND_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[weixin_imgmsg] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[weixin_imgmsg]( [IMGMSG_ID] [nvarchar](100) NOT NULL, [KEYWORD] [nvarchar](255) NULL, [CREATETIME] [nvarchar](100) NULL, [STATUS] [int] NOT NULL, [BZ] [nvarchar](255) NULL, [TITLE1] [nvarchar](255) NULL, [DESCRIPTION1] [nvarchar](255) NULL, [IMGURL1] [nvarchar](255) NULL, [TOURL1] [nvarchar](255) NULL, [TITLE2] [nvarchar](255) NULL, [DESCRIPTION2] [nvarchar](255) NULL, [IMGURL2] [nvarchar](255) NULL, [TOURL2] [nvarchar](255) NULL, [TITLE3] [nvarchar](255) NULL, [DESCRIPTION3] [nvarchar](255) NULL, [IMGURL3] [nvarchar](255) NULL, [TOURL3] [nvarchar](255) NULL, [TITLE4] [nvarchar](255) NULL, [DESCRIPTION4] [nvarchar](255) NULL, [IMGURL4] [nvarchar](255) NULL, [TOURL4] [nvarchar](255) NULL, [TITLE5] [nvarchar](255) NULL, [DESCRIPTION5] [nvarchar](255) NULL, [IMGURL5] [nvarchar](255) NULL, [TOURL5] [nvarchar](255) NULL, [TITLE6] [nvarchar](255) NULL, [DESCRIPTION6] [nvarchar](255) NULL, [IMGURL6] [nvarchar](255) NULL, [TOURL6] [nvarchar](255) NULL, [TITLE7] [nvarchar](255) NULL, [DESCRIPTION7] [nvarchar](255) NULL, [IMGURL7] [nvarchar](255) NULL, [TOURL7] [nvarchar](255) NULL, [TITLE8] [nvarchar](255) NULL, [DESCRIPTION8] [nvarchar](255) NULL, [IMGURL8] [nvarchar](255) NULL, [TOURL8] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [IMGMSG_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = 
OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO /****** Object: Table [dbo].[weixin_textmsg] Script Date: 2016/2/4 11:00:37 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[weixin_textmsg]( [TEXTMSG_ID] [nvarchar](100) NOT NULL, [KEYWORD] [nvarchar](255) NULL, [CONTENT] [nvarchar](255) NULL, [CREATETIME] [nvarchar](100) NULL, [STATUS] [int] NULL, [BZ] [nvarchar](255) NULL, PRIMARY KEY CLUSTERED ( [TEXTMSG_ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO INSERT [dbo].[sys_app_user] ([USER_ID], [USERNAME], [PASSWORD], [NAME], [RIGHTS], [ROLE_ID], [LAST_LOGIN], [IP], [STATUS], [BZ], [PHONE], [SFID], [START_TIME], [END_TIME], [YEARS], [NUMBER], [EMAIL]) VALUES (N'1e89e6504be349a68c025976b3ecc1d1', N'a1', N'698d51a19d8a121ce581499d7b701668', N'会员甲', N'', N'115b386ff04f4352b060dffcd2b5d1da', N'', N'', N'1', N'121', N'1212', N'1212', N'2015-12-02', N'2015-12-25', 2, N'111', N'313596790@qq.com') INSERT [dbo].[sys_app_user] ([USER_ID], [USERNAME], [PASSWORD], [NAME], [RIGHTS], [ROLE_ID], [LAST_LOGIN], [IP], [STATUS], [BZ], [PHONE], [SFID], [START_TIME], [END_TIME], [YEARS], [NUMBER], [EMAIL]) VALUES (N'ead1f56708e4409c8d071e0a699e5633', N'a2', N'bcbe3365e6ac95ea2c0343a2395834dd', N'会员乙', N'', N'1b67fc82ce89457a8347ae53e43a347e', N'', N'', N'0', N'', N'', N'', N'2015-12-01', N'2015-12-24', 1, N'121', N'978336446@qq.com') INSERT [dbo].[sys_createcode] ([CREATECODE_ID], [PACKAGENAME], [OBJECTNAME], [TABLENAME], [FIELDLIST], [CREATETIME], [TITLE]) VALUES (N'002ea762e3e242a7a10ea5ca633701d8', N'system', N'Buttonrights', N'sys_,fh,BUTTONRIGHTS', N'NAME,fh,String,fh,名称,fh,是,fh,无,fh,255Q313596790', N'2016-01-16 23:20:36', N'按钮权限') INSERT [dbo].[sys_createcode] ([CREATECODE_ID], [PACKAGENAME], [OBJECTNAME], [TABLENAME], [FIELDLIST], [CREATETIME], [TITLE]) VALUES 
(N'11c0f9b57ec94cefa21d58ed5c6161ae', N'system', N'Testfh', N'FH_,fh,TESTFH', N'NAME,fh,String,fh,姓名,fh,是,fh,无,fh,255Q313596790AGE,fh,Integer,fh,年龄,fh,是,fh,无,fh,11Q313596790BIRTHDAY,fh,Date,fh,生日,fh,是,fh,无,fh,32Q313596790', N'2016-02-01 15:45:18', N'测试') INSERT [dbo].[sys_createcode] ([CREATECODE_ID], [PACKAGENAME], [OBJECTNAME], [TABLENAME], [FIELDLIST], [CREATETIME], [TITLE]) VALUES (N'c7586f931fd44c61beccd3248774c68c', N'system', N'Department', N'SYS_,fh,DEPARTMENT', N'NAME,fh,String,fh,名称,fh,是,fh,无,fh,30Q313596790NAME_EN,fh,String,fh,英文,fh,是,fh,无,fh,50Q313596790BIANMA,fh,String,fh,编码,fh,是,fh,无,fh,50Q313596790PARENT_ID,fh,String,fh,上级ID,fh,否,fh,无,fh,100Q313596790BZ,fh,String,fh,备注,fh,是,fh,无,fh,255Q313596790HEADMAN,fh,String,fh,负责人,fh,是,fh,无,fh,30Q313596790TEL,fh,String,fh,电话,fh,是,fh,无,fh,50Q313596790FUNCTIONS,fh,String,fh,部门职能,fh,是,fh,无,fh,255Q313596790ADDRESS,fh,String,fh,地址,fh,是,fh,无,fh,255Q313596790', N'2015-12-20 01:49:25', N'组织机构') INSERT [dbo].[sys_createcode] ([CREATECODE_ID], [PACKAGENAME], [OBJECTNAME], [TABLENAME], [FIELDLIST], [CREATETIME], [TITLE]) VALUES (N'dbd7b8330d774dcabd184eca8668a295', N'system', N'Fhsms', N'SYS_,fh,FHSMS', N'CONTENT,fh,String,fh,内容,fh,是,fh,无,fh,1000Q313596790TYPE,fh,String,fh,类型,fh,否,fh,无,fh,5Q313596790TO_USERNAME,fh,String,fh,收信人,fh,是,fh,无,fh,255Q313596790FROM_USERNAME,fh,String,fh,发信人,fh,是,fh,无,fh,255Q313596790SEND_TIME,fh,String,fh,发信时间,fh,是,fh,无,fh,100Q313596790STATUS,fh,String,fh,状态,fh,否,fh,无,fh,5Q313596790SANME_ID,fh,String,fh,共同ID,fh,是,fh,无,fh,100Q313596790', N'2016-01-23 01:44:15', N'站内信') INSERT [dbo].[sys_createcode] ([CREATECODE_ID], [PACKAGENAME], [OBJECTNAME], [TABLENAME], [FIELDLIST], [CREATETIME], [TITLE]) VALUES (N'fe239f8742194481a5b56f90cad71520', N'system', N'Fhbutton', N'SYS_,fh,FHBUTTON', N'NAME,fh,String,fh,名称,fh,是,fh,无,fh,30Q313596790QX_NAME,fh,String,fh,权限标识,fh,是,fh,无,fh,50Q313596790BZ,fh,String,fh,备注,fh,是,fh,无,fh,255Q313596790', N'2016-01-15 18:38:40', N'按钮管理') INSERT [dbo].[sys_department] 
([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'0956d8c279274fca92f4091f2a69a9ad', N'销售会计', N'xiaokuai', N'05896', N'd41af567914a409893d011aa53eda797', N'', N'', N'', N'', N'') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'3e7227e11dc14b4d9e863dd1a1fcedf6', N'成本会计', N'chengb', N'03656', N'd41af567914a409893d011aa53eda797', N'', N'', N'', N'', N'') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'5cccdb7c432449d8b853c52880058140', N'B公司', N'b', N'002', N'0', N'冶铁', N'李四', N'112', N'冶铁', N'河北') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'83a25761c618457cae2fa1211bd8696d', N'销售B组', N'xiaob', N'002365', N'cbbc84eddde947ba8af7d509e430eb70', N'', N'李四', N'', N'', N'') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'8f8b045470f342fdbc4c312ab881d62b', N'销售A组', N'xiaoA', N'0326', N'cbbc84eddde947ba8af7d509e430eb70', N'', N'张三', N'0201212', N'', N'') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'a0982dea52554225ab682cd4b421de47', N'1队', N'yidui', N'02563', N'8f8b045470f342fdbc4c312ab881d62b', N'', N'小王', N'12356989', N'', N'') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'a6c6695217ba4a4dbfe9f7e9d2c06730', N'A公司', N'a', N'001', N'0', N'挖煤', N'张三', N'110', N'洼煤矿', N'山西') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) 
VALUES (N'cbbc84eddde947ba8af7d509e430eb70', N'销售部', N'xiaoshoubu', N'00201', N'5cccdb7c432449d8b853c52880058140', N'推销商品', N'小明', N'11236', N'推销商品', N'909办公室') INSERT [dbo].[sys_department] ([DEPARTMENT_ID], [NAME], [NAME_EN], [BIANMA], [PARENT_ID], [BZ], [HEADMAN], [TEL], [FUNCTIONS], [ADDRESS]) VALUES (N'd41af567914a409893d011aa53eda797', N'财务部', N'caiwubu', N'00101', N'a6c6695217ba4a4dbfe9f7e9d2c06730', N'负责发工资', N'王武', N'11236', N'管理财务', N'308办公室') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'096e4ec8986149d994b09e604504e38d', N'黄浦区', N'huangpu', N'0030201', 1, N'f1ea30ddef1340609c35c88fb2919bee', N'黄埔', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'12a62a3e5bed44bba0412b7e6b733c93', N'北京', N'beijing', N'00301', 1, N'be4a8c5182c744d28282a5345783a77f', N'北京', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'507fa87a49104c7c8cdb52fdb297da12', N'宣武区', N'xuanwuqu', N'0030101', 1, N'12a62a3e5bed44bba0412b7e6b733c93', N'宣武区', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'8994f5995f474e2dba6cfbcdfe5ea07a', N'语文', N'yuwen', N'00201', 1, N'fce20eb06d7b4b4d8f200eda623f725c', N'语文', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'8ea7c44af25f48b993a14f791c8d689f', N'分类', N'fenlei', N'001', 1, N'0', N'分类', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'be4a8c5182c744d28282a5345783a77f', N'地区', N'diqu', N'003', 3, N'0', N'地区', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], 
[TBSNAME]) VALUES (N'd428594b0494476aa7338d9061e23ae3', N'红色', N'red', N'00101', 1, N'8ea7c44af25f48b993a14f791c8d689f', N'红色', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'de9afadfbed0428fa343704d6acce2c4', N'绿色', N'green', N'00102', 2, N'8ea7c44af25f48b993a14f791c8d689f', N'绿色', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'f1ea30ddef1340609c35c88fb2919bee', N'上海', N'shanghai', N'00302', 2, N'be4a8c5182c744d28282a5345783a77f', N'上海', N'') INSERT [dbo].[sys_dictionaries] ([DICTIONARIES_ID], [NAME], [NAME_EN], [BIANMA], [ORDER_BY], [PARENT_ID], [BZ], [TBSNAME]) VALUES (N'fce20eb06d7b4b4d8f200eda623f725c', N'课程', N'kecheng', N'002', 2, N'0', N'课程', N'') INSERT [dbo].[sys_fhbutton] ([FHBUTTON_ID], [NAME], [QX_NAME], [BZ]) VALUES (N'3542adfbda73410c976e185ffe50ad06', N'导出EXCEL', N'toExcel', N'导出EXCEL') INSERT [dbo].[sys_fhbutton] ([FHBUTTON_ID], [NAME], [QX_NAME], [BZ]) VALUES (N'46992ea280ba4b72b29dedb0d4bc0106', N'发邮件', N'email', N'发送电子邮件') INSERT [dbo].[sys_fhbutton] ([FHBUTTON_ID], [NAME], [QX_NAME], [BZ]) VALUES (N'4efa162fce8340f0bd2dcd3b11d327ec', N'导入EXCEL', N'FromExcel', N'导入EXCEL到系统用户') INSERT [dbo].[sys_fhbutton] ([FHBUTTON_ID], [NAME], [QX_NAME], [BZ]) VALUES (N'cc51b694d5344d28a9aa13c84b7166cd', N'发短信', N'sms', N'发送短信') INSERT [dbo].[sys_fhbutton] ([FHBUTTON_ID], [NAME], [QX_NAME], [BZ]) VALUES (N'da7fd386de0b49ce809984f5919022b8', N'站内信', N'FHSMS', N'发送站内信') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'05879f5868824f35932ee9f2062adc03', N'你好', N'2', N'admin', N'san', N'2016-01-25 14:05:31', N'1', N'b311e893228f42d5a05dbe16917fd16f') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES 
(N'2635dd035c6f4bb5a091abdd784bd899', N'你好', N'2', N'san', N'admin', N'2016-01-25 14:05:02', N'2', N'1b7637306683460f89174c2b025862b5') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'52378ccd4e2d4fe08994d1652af87c68', N'你好', N'1', N'admin', N'san', N'2016-01-25 16:26:44', N'1', N'920b20dafdfb4c09b560884eb277c51d') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'77ed13f9c49a4c4bb460c41b8580dd36', N'gggg', N'2', N'admin', N'san', N'2016-01-24 21:22:43', N'2', N'dd9ee339576e48c5b046b94fa1901d00') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'98a6869f942042a1a037d9d9f01cb50f', N'你好', N'1', N'admin', N'san', N'2016-01-25 14:05:02', N'2', N'1b7637306683460f89174c2b025862b5') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'9e00295529014b6e8a27019cbccb3da1', N'柔柔弱弱', N'1', N'admin', N'san', N'2016-01-24 21:22:57', N'1', N'a29603d613ea4e54b5678033c1bf70a6') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'd3aedeb430f640359bff86cd657a8f59', N'你好', N'1', N'admin', N'san', N'2016-01-24 21:22:12', N'1', N'f022fbdce3d845aba927edb698beb90b') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'e5376b1bd54b489cb7f2203632bd74ec', N'管理员好', N'2', N'admin', N'san', N'2016-01-25 14:06:13', N'2', N'b347b2034faf43c79b54be4627f3bd2b') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'e613ac0fcc454f32895a70b747bf4fb5', N'你也好', N'2', N'admin', N'san', N'2016-01-25 16:27:54', N'2', 
N'ce8dc3b15afb40f28090f8b8e13f078d') INSERT [dbo].[sys_fhsms] ([FHSMS_ID], [CONTENT], [TYPE], [TO_USERNAME], [FROM_USERNAME], [SEND_TIME], [STATUS], [SANME_ID]) VALUES (N'f25e00cfafe741a3a05e3839b66dc7aa', N'你好', N'2', N'san', N'admin', N'2016-01-25 16:26:44', N'1', N'920b20dafdfb4c09b560884eb277c51d') INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (1, N'系统管理', N'#', N'0', N'1', N'menu-icon fa fa-desktop blue', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (2, N'权限管理', N'#', N'1', N'1', N'menu-icon fa fa-lock black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (6, N'信息管理', N'#', N'0', N'5', N'menu-icon fa fa-credit-card green', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (7, N'图片管理', N'pictures/list.do', N'6', N'1', N'menu-icon fa fa-folder-o pink', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (8, N'性能监控', N'druid/index.html', N'9', N'1', N'menu-icon fa fa-tachometer red', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (9, N'系统工具', N'#', N'0', N'3', N'menu-icon fa fa-cog black', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (10, N'接口测试', N'tool/interfaceTest.do', N'9', N'2', N'menu-icon fa fa-exchange green', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (11, N'发送邮件', N'tool/goSendEmail.do', N'9', 
N'3', N'menu-icon fa fa-envelope-o green', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (12, N'置二维码', N'tool/goTwoDimensionCode.do', N'9', N'4', N'menu-icon fa fa-barcode green', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (14, N'地图工具', N'tool/map.do', N'9', N'6', N'menu-icon fa fa-globe black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (15, N'微信管理', N'#', N'0', N'4', N'menu-icon fa fa-comments purple', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (16, N'文本回复', N'textmsg/list.do', N'15', N'2', N'menu-icon fa fa-comment green', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (17, N'应用命令', N'command/list.do', N'15', N'4', N'menu-icon fa fa-comment grey', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (18, N'图文回复', N'imgmsg/list.do', N'15', N'3', N'menu-icon fa fa-comment pink', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (19, N'关注回复', N'textmsg/goSubscribe.do', N'15', N'1', N'menu-icon fa fa-comment orange', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (20, N'在线管理', N'onlinemanager/list.do', N'1', N'6', N'menu-icon fa fa-laptop green', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (21, N'打印测试', 
N'tool/printTest.do', N'9', N'7', N'menu-icon fa fa-hdd-o grey', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (22, N'一级菜单', N'#', N'0', N'6', N'menu-icon fa fa-fire orange', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (23, N'二级菜单', N'#', N'22', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (24, N'三级菜单', N'#', N'23', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (30, N'四级菜单', N'#', N'24', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (31, N'五级菜单1', N'#', N'30', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (32, N'五级菜单2', N'#', N'30', N'2', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (33, N'六级菜单', N'#', N'31', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (34, N'六级菜单2', N'login_default.do', N'31', N'2', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (35, N'四级菜单2', N'login_default.do', N'24', N'2', N'menu-icon fa fa-leaf black', N'1', 1) INSERT 
[dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (36, N'角色(基础权限)', N'role.do', N'2', N'1', N'menu-icon fa fa-key orange', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (37, N'按钮权限', N'buttonrights/list.do', N'2', N'2', N'menu-icon fa fa-key green', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (38, N'菜单管理', N'menu/listAllMenu.do', N'1', N'3', N'menu-icon fa fa-folder-open-o brown', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (39, N'按钮管理', N'fhbutton/list.do', N'1', N'2', N'menu-icon fa fa-download orange', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (40, N'用户管理', N'#', N'0', N'2', N'menu-icon fa fa-users blue', N'2', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (41, N'系统用户', N'user/listUsers.do', N'40', N'1', N'menu-icon fa fa-users green', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (42, N'会员管理', N'happuser/listUsers.do', N'40', N'2', N'menu-icon fa fa-users orange', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (43, N'数据字典', N'dictionaries/listAllDict.do?DICTIONARIES_ID=0', N'1', N'4', N'menu-icon fa fa-book purple', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (44, N'代码生成', N'createCode/list.do', N'9', N'0', 
N'menu-icon fa fa-cogs brown', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (45, N'七级菜单1', N'#', N'33', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (46, N'七级菜单2', N'#', N'33', N'2', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (47, N'八级菜单', N'login_default.do', N'45', N'1', N'menu-icon fa fa-leaf black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (48, N'图表报表', N' tool/fusionchartsdemo.do', N'9', N'5', N'menu-icon fa fa-bar-chart-o black', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (49, N'组织机构', N'department/listAllDepartment.do?DEPARTMENT_ID=0', N'1', N'5', N'menu-icon fa fa-users blue', N'1', 1) INSERT [dbo].[sys_menu] ([MENU_ID], [MENU_NAME], [MENU_URL], [PARENT_ID], [MENU_ORDER], [MENU_ICON], [MENU_TYPE], [MENU_STATE]) VALUES (50, N'站内信', N'fhsms/list.do', N'6', N'2', N'menu-icon fa fa-envelope green', N'1', 1) INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'1', N'系统管理组', N'4503598587174854', N'0', N'1', N'1', N'1', N'1') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'115b386ff04f4352b060dffcd2b5d1da', N'中级会员', N'498', N'2', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'1b67fc82ce89457a8347ae53e43a347e', N'初级会员', N'498', N'2', N'0', N'0', N'0', N'0') INSERT 
[dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'2', N'会员组', N'498', N'0', N'0', N'0', N'0', N'1') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'3264c8e83d0248bb9e3ea6195b4c0216', N'一级管理员', N'4503598587174854', N'1', N'2251798773489606', N'2251798773489606', N'1125898866646982', N'2251798773489606') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'46294b31a71c4600801724a6eb06bb26', N'职位组', N'', N'0', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'5466347ac07044cb8d82990ec7f3a90e', N'主管', N'', N'46294b31a71c4600801724a6eb06bb26', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'68f8e4a39efe47c7bb869e9d15ab925d', N'二级管理员', N'4503598587174854', N'1', N'0', N'0', N'2251798773489606', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'856849f422774ad390a4e564054d8cc8', N'经理', N'', N'46294b31a71c4600801724a6eb06bb26', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'8b70a7e67f2841e7aaba8a4d92e5ff6f', N'高级会员', N'498', N'2', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'c21cecf84048434b93383182b1d98cba', N'组长', N'', N'46294b31a71c4600801724a6eb06bb26', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'd449195cd8e7491080688c58e11452eb', N'总监', N'', N'46294b31a71c4600801724a6eb06bb26', N'0', N'0', 
N'0', N'0') INSERT [dbo].[sys_role] ([ROLE_ID], [ROLE_NAME], [RIGHTS], [PARENT_ID], [ADD_QX], [DEL_QX], [EDIT_QX], [CHA_QX]) VALUES (N'de9de2f006e145a29d52dfadda295353', N'三级管理员', N'4503598587174854', N'1', N'0', N'0', N'0', N'0') INSERT [dbo].[sys_user] ([USER_ID], [USERNAME], [PASSWORD], [NAME], [RIGHTS], [ROLE_ID], [LAST_LOGIN], [IP], [STATUS], [BZ], [SKIN], [EMAIL], [NUMBER], [PHONE]) VALUES (N'1 ', N'admin', N'de41b7fb99201d8334c23c014db35ecd92df81bc', N'系统管理员', N'1133671055321055258374707980945218933803269864762743594642571294', N'1', N'2016-02-01 16:36:10', N'127.0.0.1', N'0', N'最高统治者', N'default', N'QQ313596790@main.com', N'001', N'18788888888') INSERT [dbo].[sys_user] ([USER_ID], [USERNAME], [PASSWORD], [NAME], [RIGHTS], [ROLE_ID], [LAST_LOGIN], [IP], [STATUS], [BZ], [SKIN], [EMAIL], [NUMBER], [PHONE]) VALUES (N'69177258a06e4927b4639ab1684c3320', N'san', N'47c4a8dc64ac2f0bb46bbd8813b037c9718f9349', N'三', N'', N'3264c8e83d0248bb9e3ea6195b4c0216', N'2016-01-25 16:25:36', N'192.168.1.102', N'0', N'111', N'default', N'978336446@qq.com', N'333', N'13562202556') INSERT [dbo].[sys_user] ([USER_ID], [USERNAME], [PASSWORD], [NAME], [RIGHTS], [ROLE_ID], [LAST_LOGIN], [IP], [STATUS], [BZ], [SKIN], [EMAIL], [NUMBER], [PHONE]) VALUES (N'9991f4d7782a4ccfb8a65bd96ea7aafa', N'lisi', N'2612ade71c1e48cd7150b5f4df152faa699cedfe', N'李四', N'', N'3264c8e83d0248bb9e3ea6195b4c0216', N'2016-01-06 01:24:26', N'127.0.0.1', N'0', N'小李', N'default', N'313596790@qq.com', N'1102', N'13566233663') INSERT [dbo].[sys_user] ([USER_ID], [USERNAME], [PASSWORD], [NAME], [RIGHTS], [ROLE_ID], [LAST_LOGIN], [IP], [STATUS], [BZ], [SKIN], [EMAIL], [NUMBER], [PHONE]) VALUES (N'e29149962e944589bb7da23ad18ddeed', N'zhangsan', N'c2da1419caf053885c492e10ebde421581cdc03f', N'张三', N'', N'3264c8e83d0248bb9e3ea6195b4c0216', N'', N'', N'0', N'小张', N'default', N'zhangsan@www.com', N'1101', N'2147483647') INSERT [dbo].[tb_pictures] ([PICTURES_ID], [TITLE], [NAME], [PATH], [CREATETIME], [MASTER_ID], [BZ]) 
VALUES (N'b06010340ee54cfab49b8bfbe2387557', N'图片', N'5e6ba5ad3067482e9a8063b0627ee983.png', N'20160125/5e6ba5ad3067482e9a8063b0627ee983.png', N'2016-01-25 16:49:44', N'1', N'图片管理处上传') INSERT [dbo].[tb_pictures] ([PICTURES_ID], [TITLE], [NAME], [PATH], [CREATETIME], [MASTER_ID], [BZ]) VALUES (N'c9f1eca620c94c27bfa7028c66911f41', N'图片', N'928da750ec8542ceb7b2495f45ea6a9e.jpg', N'20160125/928da750ec8542ceb7b2495f45ea6a9e.jpg', N'2016-01-25 16:49:44', N'1', N'图片管理处上传') INSERT [dbo].[weixin_command] ([COMMAND_ID], [KEYWORD], [COMMANDCODE], [CREATETIME], [STATUS], [BZ]) VALUES (N'2636750f6978451b8330874c9be042c2', N'锁定服务器', N'rundll32.exe user32.dll,LockWorkStation', N'2015-05-10 21:25:06', 1, N'锁定计算机') INSERT [dbo].[weixin_command] ([COMMAND_ID], [KEYWORD], [COMMANDCODE], [CREATETIME], [STATUS], [BZ]) VALUES (N'46217c6d44354010823241ef484f7214', N'打开浏览器', N'C:/Program Files/Internet Explorer/iexplore.exe', N'2015-05-09 02:43:02', 1, N'打开浏览器操作') INSERT [dbo].[weixin_command] ([COMMAND_ID], [KEYWORD], [COMMANDCODE], [CREATETIME], [STATUS], [BZ]) VALUES (N'576adcecce504bf3bb34c6b4da79a177', N'关闭浏览器', N'taskkill /f /im iexplore.exe', N'2015-05-09 02:36:48', 2, N'关闭浏览器操作') INSERT [dbo].[weixin_command] ([COMMAND_ID], [KEYWORD], [COMMANDCODE], [CREATETIME], [STATUS], [BZ]) VALUES (N'854a157c6d99499493f4cc303674c01f', N'关闭QQ', N'taskkill /f /im qq.exe', N'2015-05-10 21:25:46', 1, N'关闭QQ') INSERT [dbo].[weixin_command] ([COMMAND_ID], [KEYWORD], [COMMANDCODE], [CREATETIME], [STATUS], [BZ]) VALUES (N'ab3a8c6310ca4dc8b803ecc547e55ae7', N'打开QQ', N'D:/SOFT/QQ/QQ/Bin/qq.exe', N'2015-05-10 21:25:25', 1, N'打开QQ') INSERT [dbo].[weixin_textmsg] ([TEXTMSG_ID], [KEYWORD], [CONTENT], [CREATETIME], [STATUS], [BZ]) VALUES (N'63681adbe7144f10b66d6863e07f23c2', N'你好', N'你也好', N'2015-05-09 02:39:23', 1, N'文本回复') INSERT [dbo].[weixin_textmsg] ([TEXTMSG_ID], [KEYWORD], [CONTENT], [CREATETIME], [STATUS], [BZ]) VALUES (N'695cd74779734231928a253107ab0eeb', N'吃饭', N'吃了噢噢噢噢', N'2015-05-10 22:52:27', 
1, N'文本回复') INSERT [dbo].[weixin_textmsg] ([TEXTMSG_ID], [KEYWORD], [CONTENT], [CREATETIME], [STATUS], [BZ]) VALUES (N'd4738af7aea74a6ca1a5fb25a98f9acb', N'关注', N'这里是关注后回复的内容', N'2015-05-11 02:12:36', 1, N'关注回复') ALTER TABLE [dbo].[FH_TESTFH] ADD DEFAULT (NULL) FOR [NAME] GO ALTER TABLE [dbo].[FH_TESTFH] ADD DEFAULT (NULL) FOR [BIRTHDAY] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [USERNAME] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [PASSWORD] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [NAME] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [RIGHTS] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [ROLE_ID] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [LAST_LOGIN] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [IP] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [STATUS] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [BZ] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [PHONE] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [SFID] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [START_TIME] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [END_TIME] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [YEARS] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [NUMBER] GO ALTER TABLE [dbo].[sys_app_user] ADD DEFAULT (NULL) FOR [EMAIL] GO ALTER TABLE [dbo].[sys_createcode] ADD DEFAULT (NULL) FOR [PACKAGENAME] GO ALTER TABLE [dbo].[sys_createcode] ADD DEFAULT (NULL) FOR [OBJECTNAME] GO ALTER TABLE [dbo].[sys_createcode] ADD DEFAULT (NULL) FOR [TABLENAME] GO ALTER TABLE [dbo].[sys_createcode] ADD DEFAULT (NULL) FOR [FIELDLIST] GO ALTER TABLE [dbo].[sys_createcode] ADD DEFAULT (NULL) FOR [CREATETIME] GO ALTER TABLE [dbo].[sys_createcode] ADD DEFAULT (NULL) FOR [TITLE] GO ALTER TABLE [dbo].[sys_department] ADD DEFAULT (NULL) FOR [NAME] GO ALTER TABLE [dbo].[sys_department] ADD DEFAULT (NULL) FOR 
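If the goal is speed rather than switching to regex, the question's original `json.loads` approach can also be kept but flattened: parse each string once, then read `"name"` by direct key lookup, which removes the innermost key-scanning loop entirely. Below is a minimal sketch with made-up sample rows mirroring the question's `index3` column; if the real strings use single quotes and are not valid JSON, `ast.literal_eval` can be swapped in for `json.loads`.

```python
import json

# Hypothetical rows standing in for the index3 column (JSON strings)
rows = [
    '[{"name": "Mary", "age": "7"}, {"name": "Jack", "age": "11"}]',
    '[{"name": "Lucy", "age": "9"}, {"name": "Nancy", "age": "10"}]',
]

# json.loads parses each string once into a list of dicts; the
# comprehension then reads "name" by key lookup instead of looping
# over every key, so only two loop levels remain.
names = [d["name"] for s in rows for d in json.loads(s) if "name" in d]
print(names)  # ['Mary', 'Jack', 'Lucy', 'Nancy']
```

On a pandas DataFrame the same idea applies per row, e.g. `df["index3"].map(lambda s: [d["name"] for d in json.loads(s) if "name" in d])`, avoiding an explicit Python loop over the frame.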