Solving the Out-of-Memory Problem with ShallowEtagHeaderFilter on Large File Downloads
While working on large file downloads recently, I ran into an "Out of memory" exception. Inspecting the controller-layer code showed that it writes to the response through a BufferedOutputStream with a buffer of only 8 MB, so in theory it should not run out of memory.
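For context, here is a minimal sketch of the kind of streaming download controller involved; the class name, request mapping and file path are illustrative, not the actual project code:

import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;

import javax.servlet.http.HttpServletResponse;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
public class FileDownloadController {

    @RequestMapping("/download/bigfile")
    public void download(HttpServletResponse response) throws Exception {
        response.setContentType("application/octet-stream");
        // Stream the file to the client in fixed-size chunks; at any moment
        // only one buffer's worth of data should live in memory.
        InputStream in = new FileInputStream("/data/bigfile.zip");
        OutputStream out = new BufferedOutputStream(
                response.getOutputStream(), 8 * 1024 * 1024);
        try {
            byte[] buffer = new byte[8192];
            int len;
            while ((len = in.read(buffer)) != -1) {
                out.write(buffer, 0, len);
            }
            out.flush();
        } finally {
            in.close();
        }
    }
}

Written this way, the controller itself never holds more than the buffer in memory, which is why the stack trace pointing somewhere other than the controller was the real clue.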
Looking closely at the exception stack trace, I noticed the filter "ShallowEtagHeaderFilter" in it. This filter is responsible for handling ETag headers. For an introduction to ETag and to this filter, see:
ETag (Baidu Baike):
http://baike.baidu.com/view/3039264.htm
Introduction to ShallowEtagHeaderFilter:
http://blog.csdn.net/geloin/article/details/7445251
Reading the code confirmed that the problem lies in this filter: it captures everything written through the buffered stream into a ByteArrayOutputStream before writing it to the response. That ByteArrayOutputStream lives entirely in memory and has to be expanded over and over as the body grows, so a large file download naturally ends in an out-of-memory error.
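To make the cause concrete, here is a simplified sketch of the buffering pattern the filter uses; this illustrates the idea only and is not Spring's actual source. The whole response body is collected into an in-memory buffer so that an MD5-based ETag can be computed from it before anything reaches the client:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

import javax.servlet.http.HttpServletResponse;

// Simplified illustration of the buffering pattern only -- not Spring's source code.
public class EtagBufferingSketch {

    protected void writeWithEtag(HttpServletResponse response,
            ByteArrayOutputStream content) throws IOException {
        // 'content' stands for the in-memory buffer that the wrapped response
        // fills while the rest of the filter chain (and the controller) writes
        // the body. For a multi-GB download this backing array keeps growing
        // until the heap is exhausted.
        byte[] body = content.toByteArray();
        // The ETag is a digest of the complete body, which is exactly why the
        // whole body has to sit in memory before a single byte is sent.
        response.setHeader("ETag", "\"" + md5Hex(body) + "\"");
        response.setContentLength(body.length);
        response.getOutputStream().write(body);
    }

    private static String md5Hex(byte[] body) throws IOException {
        try {
            byte[] digest = MessageDigest.getInstance("MD5").digest(body);
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IOException(e);
        }
    }
}

Because the ETag is derived from the complete body, the filter cannot stream the response at all; for a response of a few gigabytes the buffer alone exceeds the heap.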
Solution
Since most URLs still need this filter, and only the file-download URLs have to be excluded, OneCoder decided to subclass the filter and add a black list: override its doFilterInternal method so that requests whose URL is on the black list are handed straight to the next filter in the chain, while all other requests call super and continue through the original logic:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.AntPathMatcher;
import org.springframework.util.PathMatcher;
import org.springframework.web.filter.ShallowEtagHeaderFilter;

/**
 * The filter is used for resolving the big file download problem when using
 * {@link ShallowEtagHeaderFilter}. The URLs on the black list will be passed
 * directly to the next filter in the chain, the others will be filtered as
 * before.
 * <p>
 * Sample:<br>
 * {@code <filter>}<br>
 * &nbsp;&nbsp;&nbsp; {@code <filter-name>BigFileEtagFilter</filter-name>}<br>
 * &nbsp;&nbsp;&nbsp;
 * {@code <filter-class>com.coderli.filter.BigFileDownloadEtagHeaderFilter</filter-class>}<br>
 * &nbsp;&nbsp;&nbsp; {@code <!-- URL separators include: blank space, ';', ',' and \r\n.
 * Black list is optional. -->}<br>
 * &nbsp;&nbsp;&nbsp; {@code <init-param>}<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; {@code <param-name>blackListURL</param-name>}<br>
 * &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 * {@code <param-value> /aa /bb/** /cc/* </param-value>}<br>
 * &nbsp;&nbsp;&nbsp; {@code </init-param>}<br>
 * {@code </filter>}<br>
 * {@code <filter-mapping>}<br>
 * &nbsp;&nbsp;&nbsp; {@code <filter-name>BigFileEtagFilter</filter-name>}<br>
 * &nbsp;&nbsp;&nbsp; {@code <url-pattern>/*</url-pattern>}<br>
 * {@code </filter-mapping>}
 *
 * @author li_hongzhe@nhn.com
 * @date 2014-9-12 9:46:38
 */
public class BigFileDownloadEtagHeaderFilter extends ShallowEtagHeaderFilter {

    private final String[] NULL_STRING_ARRAY = new String[0];
    private final String URL_SPLIT_PATTERN = "[, ;\r\n]";
    private final PathMatcher pathMatcher = new AntPathMatcher();
    private final Logger logger = LoggerFactory
            .getLogger(BigFileDownloadEtagHeaderFilter.class);

    // url white list
    // private String[] whiteListURLs = null;
    // url black list
    private String[] blackListURLs = null;

    @Override
    public final void initFilterBean() {
        initConfig();
    }

    @Override
    protected void doFilterInternal(HttpServletRequest request,
            HttpServletResponse response, FilterChain filterChain)
            throws ServletException, IOException {
        String reqUrl = request.getPathInfo();
        // getPathInfo() may be null depending on the servlet mapping; such
        // requests simply fall through to the normal ETag handling.
        if (reqUrl != null && isBlackURL(reqUrl)) {
            logger.debug("Current url {} is on the black list.", reqUrl);
            // Skip the ETag buffering and hand the request straight to the
            // next filter so the body is streamed, not cached in memory.
            filterChain.doFilter(request, response);
        } else {
            super.doFilterInternal(request, response, filterChain);
        }
    }

    private void initConfig() {
        // No need white list now.
        // String whiteListURLStr = getFilterConfig().getInitParameter(
        //         "whiteListURL");
        // whiteListURLs = strToArray(whiteListURLStr);
        String blackListURLStr = getFilterConfig().getInitParameter(
                "blackListURL");
        blackListURLs = strToArray(blackListURLStr);
    }

    // No need white list now.
    // private boolean isWhiteURL(String currentURL) {
    //     for (String whiteURL : whiteListURLs) {
    //         if (pathMatcher.match(whiteURL, currentURL)) {
    //             logger.debug(
    //                     "url filter : white url list matches : [{}] match [{}] continue",
    //                     whiteURL, currentURL);
    //             return true;
    //         }
    //         logger.debug(
    //                 "url filter : white url list not matches : [{}] match [{}]",
    //                 whiteURL, currentURL);
    //     }
    //     return false;
    // }

    private boolean isBlackURL(String currentURL) {
        // Ant-style matching, e.g. /download/** matches every path below /download.
        for (String blackURL : blackListURLs) {
            if (pathMatcher.match(blackURL, currentURL)) {
                logger.debug(
                        "url filter : black url list matches : [{}] match [{}] break",
                        blackURL, currentURL);
                return true;
            }
            logger.debug(
                    "url filter : black url list not matches : [{}] match [{}]",
                    blackURL, currentURL);
        }
        return false;
    }

    private String[] strToArray(String urlStr) {
        if (urlStr == null) {
            return NULL_STRING_ARRAY;
        }
        // Split on any of the supported separators and drop empty entries.
        String[] urlArray = urlStr.split(URL_SPLIT_PATTERN);
        List<String> urlList = new ArrayList<String>();
        for (String url : urlArray) {
            url = url.trim();
            if (url.length() == 0) {
                continue;
            }
            urlList.add(url);
        }
        return urlList.toArray(NULL_STRING_ARRAY);
    }
}
OneCoder adapted this from a ready-made sample someone had posted online, with only minor changes: http://jinnianshilongnian.iteye.com/blog/1663481
As for this ShallowEtagHeaderFilter "bug", OneCoder found that others have already reported it to Spring, but the Spring team apparently treats it as a usage problem rather than a bug and will not fix it, so we just work around it ourselves.