First publish on public

parent 408385002d
commit 7487cd6f11

README.md (399 lines changed)
## 📚 What You Can Learn from This Project

- Learn the mainstream Java web development technologies and frameworks (Spring, Spring Boot, Spring MVC, MyBatis, MySQL, Redis, Kafka, Elasticsearch, etc.)
- Understand the whole workflow of a real web project, from development to deployment (the project comes with plenty of diagrams and detailed tutorials to help you get started quickly)
- Master the core technical points covered in this project, along with common interview questions and their analyses

## 🏄 Live Demo and Documentation

- Live demo: the project is deployed on a Tencent Cloud server, so you can try it online directly: [http://1.15.127.74/](http://1.15.127.74/)
- Documentation: generated with VuePress + Gitee Pages; online access address:

## 💻 Core Tech Stack
- Spring MVC
- ORM: MyBatis
- Database: MySQL 5.7
- Distributed cache: Redis
- Local cache: Caffeine
- Message queue: Kafka 2.13-2.7.0
- Search engine: Elasticsearch 6.4.3
- Security: Spring Security
- Email: Spring Mail
- Distributed scheduled tasks: Spring Quartz
- Monitoring: Spring Boot Actuator
- Logging: SLF4J (logging facade) + Logback (logging implementation)

Frontend:
- OS: Windows 10
- Build tool: Apache Maven
- IDE: IntelliJ IDEA
- Database: MySQL 5.7
- Application server: Apache Tomcat
- API testing tool: Postman
- Load testing tool: Apache JMeter
- Version control: Git
- Java version: 8

## 🎀 Screenshots

Home page:

Login page:

Post detail page:

Profile page:

Private messages page:

Message detail page:

System notifications page:

Notification detail page:

Account settings page:

Site statistics page:

Search results page:
## 🎨 Feature List

- [x] **Registration**

  - On successful registration, the user record is stored in MySQL with its status set to "not activated"
  - An activation email is sent to the user; clicking the link in it activates the account (Spring Mail)

- [x] **Login | Logout**

  - When the login page is opened, a captcha is generated dynamically and stored briefly in Redis (60 seconds)
  - On successful login (username, password, and captcha all verified), a login ticket is generated, its status is set to valid, and it is stored in Redis

  Note: the login ticket has an expiration time. Before any request is executed, the ticket is checked for validity and expiry; as long as the user's ticket is valid and unexpired, the request holds that user's information for its entire lifetime (kept in a ThreadLocal)

  - Checking "Remember me" extends the ticket's validity period
  - On successful login, the user's information is also cached briefly in Redis (1 hour)
  - On logout, the ticket status is set to invalid and the ticket stored in Redis is updated
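The per-request ticket check described above is typically done in a Spring MVC interceptor that binds the user to the current thread. Below is a minimal, framework-free sketch of that ThreadLocal pattern; the class and field names here are illustrative, not necessarily those used in the project:

```java
import java.time.Instant;

// Holds the current user for the lifetime of one request thread.
// A real interceptor would call clear() in afterCompletion.
class UserHolder {
    private static final ThreadLocal<String> CURRENT_USER = new ThreadLocal<>();
    static void set(String username) { CURRENT_USER.set(username); }
    static String get() { return CURRENT_USER.get(); }
    static void clear() { CURRENT_USER.remove(); }
}

// A login ticket with a validity flag and an expiration instant,
// mirroring the ticket fields described in the README.
class LoginTicket {
    final String username;
    final boolean valid;
    final Instant expiresAt;
    LoginTicket(String username, boolean valid, Instant expiresAt) {
        this.username = username;
        this.valid = valid;
        this.expiresAt = expiresAt;
    }
}

public class TicketCheckSketch {
    // Before each request: if the ticket is valid and unexpired,
    // bind the user to the current thread and let the request proceed.
    static boolean preHandle(LoginTicket ticket) {
        if (ticket != null && ticket.valid && ticket.expiresAt.isAfter(Instant.now())) {
            UserHolder.set(ticket.username);
            return true;
        }
        return false;
    }
}
```

In the real project the ticket would be looked up in Redis by the cookie value before this check runs.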
- [x] **Account Settings**

  - Change avatar
    - The avatar image chosen by the user is uploaded to a Qiniu Cloud server
  - Change password

- [x] **Post Module**

  - Publish a post (with sensitive-word filtering) and store it in MySQL
  - Display all posts with pagination
    - Supports sorting by "post time"
  - "Moderators" can see the pin and highlight buttons on a post and perform those operations
  - "Administrators" can see the delete button on a post and perform that operation
  - "Regular users" cannot see or use the pin, highlight, or delete buttons

- [x] **Comment Module**

  - Publish comments on posts (with sensitive-word filtering) and store them in MySQL
  - Display comments with pagination
  - Publish replies to comments (with sensitive-word filtering)
  - Permission control (Spring Security)
    - Users who are not logged in cannot comment

- [x] **Private Message Module**

  - Send private messages (with sensitive-word filtering)
  - Message list
    - Query the current user's conversation list
    - Supports pagination
  - Permission control (Spring Security)
    - Users who are not logged in cannot use private messaging

- [x] **Unified 404 / 500 Exception Handling**

  - Regular request exceptions
  - Asynchronous (AJAX) request exceptions

- [x] **Unified Logging**

- [x] **Like Module**

  - Supports liking posts and comments/replies
  - The first click likes; a second click cancels the like
  - The home page shows each post's like count
  - The detail page shows like counts for the post and its comments/replies
  - The detail page shows the current user's like status (already-liked is indicated)
  - Counts the likes I have received
  - Permission control (Spring Security)
    - Users who are not logged in cannot use like-related features
- [x] **Follow Module**

  - Follow
  - Unfollow
  - Counts a user's followee count and follower count
  - My followee list (the people a given user follows), with pagination
  - My follower list (a given user's followers), with pagination
  - Permission control (Spring Security)
    - Users who are not logged in cannot use follow-related features

- [x] **System Notification Module**

  - Notification list
    - Shows three types of notifications: comments, likes, and follows
  - Notification details
  - The navigation bar shows the unread count of all messages (unread private messages + unread system notifications)
  - Permission control (Spring Security)
    - Users who are not logged in cannot use system notifications

- [x] **Search Module**

  - Event publishing
    - When a post is published, it is submitted asynchronously to the Elasticsearch server through the message queue
  - Permission control (Spring Security)
    - Only administrators can view the site statistics

- [ ] File upload
- [x] Optimize site performance
  - Use the local cache Caffeine to cache the hot post list and the total number of posts

## 🔐 To Be Implemented and Improved
Below are problems I have found in this project that I haven't yet figured out how to solve; ideas are welcome, and feel free to open a PR to fix them:

- [ ] The registration module fails to redirect to the operation-hint page (it works fine when run locally)
- [ ] The front-end display of the comment feature has a bug
- [ ] "Query my comments" (incomplete)

Below are features I think could still be added to this project; likewise, feel free to open an issue suggesting more features, or open a PR implementing one:

- [ ] Forgot password (recover the password via email)
- [ ] Query my likes
- [ ] Let administrators unpin a post with a second click
- [ ] Let administrators restore deleted posts (in this project, deleting a post does not remove it from the database; its status is just set to blocked)
## 🌱 Running Locally

If you want to deploy the project locally for testing, prepare the following environment in advance:

- Java 8
- MySQL 5.7
- Redis
- Kafka 2.13-2.7.0
- Elasticsearch 6.4.3

Then **change the settings in the configuration files to match your own local environment — the project will not run as-is**, since I have replaced all sensitive values with xxxxxxx.

The configuration files that need to be modified for a local run are:

1) `application-develop.properties`:

- MySQL
- Spring Mail (the mailbox must have the SMTP service enabled)
- Kafka: consumer.group-id (this field is in consumer.properties in the Kafka distribution and can be changed; restart Kafka after changing it)
- Elasticsearch: cluster-name (this field is in elasticsearch.yml in the Elasticsearch distribution and can be changed)
- Qiniu Cloud (you need to create a Qiniu object-storage bucket to hold the uploaded avatar images)
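As a rough illustration, the entries to change in `application-develop.properties` look something like the following. The exact property keys are assumptions based on the stack described above (standard Spring Boot 2.x keys), with sensitive values masked:

```properties
# MySQL
spring.datasource.url=jdbc:mysql://localhost:3306/echo?useUnicode=true&characterEncoding=utf-8
spring.datasource.username=root
spring.datasource.password=xxxxxxx

# Spring Mail (SMTP must be enabled on the mailbox)
spring.mail.host=smtp.example.com
spring.mail.username=xxxxxxx
spring.mail.password=xxxxxxx

# Kafka (must match consumer.properties in your Kafka distribution)
spring.kafka.consumer.group-id=xxxxxxx

# Elasticsearch (must match cluster.name in elasticsearch.yml)
spring.data.elasticsearch.cluster-name=xxxxxxx
```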
2) `logback-spring-develop.xml`:

- LOG_PATH: where the log files are stored

Each run requires the following services to be up:

- MySQL
- Redis
- Elasticsearch
- Kafka

In addition, you need to create the database tables in advance; see below for details.
## 📜 Database Design

The tail of the `comment` table definition (the rest falls outside this diff):

```sql
CREATE TABLE `comment` (
  ...
) ENGINE=InnoDB AUTO_INCREMENT=247 DEFAULT CHARSET=utf8;
```
Login ticket table `login_ticket` (deprecated; tickets are now stored in Redis):

```sql
DROP TABLE IF EXISTS `login_ticket`;
SET character_set_client = utf8mb4;
CREATE TABLE `login_ticket` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `user_id` int(11) NOT NULL,
  `ticket` varchar(45) NOT NULL COMMENT 'ticket',
  `status` int(11) DEFAULT '0' COMMENT 'ticket status: 0 - valid; 1 - invalid',
  `expired` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT 'ticket expiration time',
  PRIMARY KEY (`id`),
  KEY `index_ticket` (`ticket`(20))
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```

Private message table `message` (definition truncated in this diff):

```sql
CREATE TABLE `message` (
  ...
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```
## 🌌 Ideal Deployment Architecture

I deployed only one instance of each component; the ideal deployment architecture is as follows:

## 🎯 Feature Logic Diagrams

I drew some not-entirely-rigorous diagrams to help you sort out the logic.

> Green one-way arrow:
>
> - Front-end template -> Controller: the template contains a hyperlink handled by this Controller
> - Controller -> front-end template: this Controller passes data to, or forwards to, this template
>
> Green two-way arrow: the Controller and the template pass parameters to and use each other
>
> Blue one-way arrow: A -> B means method A calls method B
>
> Red one-way arrow: a database or cache operation
### Registration

- On successful registration, the user record is stored in MySQL with its status set to "not activated"
- An activation email is sent to the user; clicking the link in it activates the account (Spring Mail)

### Login | Logout

- When the login page is opened, a captcha is generated dynamically and stored briefly in Redis (60 seconds)
- On successful login (username, password, and captcha all verified), a login ticket is generated, its status is set to valid, and it is stored in Redis

Note: the login ticket has an expiration time. Before any request is executed, the ticket is checked for validity and expiry; as long as the user's ticket is valid and unexpired, the request holds that user's information for its entire lifetime (kept in a ThreadLocal)

- Checking "Remember me" extends the ticket's validity period
- On successful login, the user's information is also cached briefly in Redis (1 hour)
- On logout, the ticket status is set to invalid and the ticket stored in Redis is updated

The diagram below shows the logic of the login module, which does not use the authentication flow provided by Spring Security (I think this is the most complex module, and the diagram still leaves out many details)
### Displaying All Posts with Pagination

- Supports sorting by "post time"
- Supports sorting by "hotness" (Spring Quartz)
- The hot post list and the total post count are stored in the local cache Caffeine (the distributed scheduled task Spring Quartz periodically recomputes each post's hotness/score — see below — while refreshing the data inside Caffeine requires no effort on our part: given an initialization/loading method, it keeps its entries up to date automatically)
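The passage above leans on Caffeine's self-refreshing `LoadingCache` behavior: give the cache a loader, and stale or missing entries are recomputed on access. To show the idea without the Caffeine dependency, here is a tiny hand-rolled stand-in (a sketch of the concept, not the project's actual code):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// A tiny stand-in for Caffeine's LoadingCache: on a miss, or after the
// entry's TTL expires, the loader recomputes the value, so callers never
// refresh the cache by hand (e.g. the loader could query MySQL for the
// hot post list).
class LoadingCacheSketch<K, V> {
    private final Map<K, Entry<V>> store = new ConcurrentHashMap<>();
    private final Function<K, V> loader;
    private final long ttlMillis;

    LoadingCacheSketch(Function<K, V> loader, long ttlMillis) {
        this.loader = loader;
        this.ttlMillis = ttlMillis;
    }

    V get(K key) {
        Entry<V> e = store.get(key);
        if (e == null || System.currentTimeMillis() - e.loadedAt > ttlMillis) {
            e = new Entry<>(loader.apply(key));   // reload on miss or expiry
            store.put(key, e);
        }
        return e.value;
    }

    static final class Entry<V> {
        final V value;
        final long loadedAt = System.currentTimeMillis();
        Entry(V value) { this.value = value; }
    }
}
```

With the real library this is roughly `Caffeine.newBuilder().expireAfterWrite(...).build(loader)`.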
### Account Settings

- Change avatar (asynchronous request)
  - The avatar image chosen by the user is uploaded to a Qiniu Cloud server
- Change password

Only the avatar change is diagrammed here:

### Publishing a Post (Asynchronous Request)
### Displaying Comments and Related Information

> The name display in the comment front end has some flaws; PRs to fix it are welcome~

The key to the comment module is the design of the comment table; only by grasping the meaning of its fields can you fully understand the logic of this feature.

A comment's target type (post or comment) entityType and entityId, as well as targetId (which user the comment/reply is directed at), are passed from the front end to DiscussPostController

The information a post detail page needs to assemble looks roughly like this:
### Adding a Comment (Transaction Management)

### Private Message List and Detail Pages

### Sending a Private Message (Asynchronous Request)

### Likes (Asynchronous Request)
Like information is stored in a Redis set. The key is named `like:entity:entityType:entityId`, and the values are the ids of the users who liked the entity. For example, key = `like:entity:2:246` with value `11` means user 11 liked the entity of type 2 (a comment) whose id is 246

The number of likes a user has received is stored in Redis under the key `like:user:userId`, whose value is that user's total received-like count
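The like/unlike toggle on that key scheme can be sketched as follows, with a plain map of sets standing in for Redis (in real Redis these would be SADD/SREM/SCARD/SISMEMBER; the method names here are illustrative, not the project's actual API):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of the like module: key -> SET of user ids, with key names
// following the scheme like:entity:entityType:entityId from the text.
class LikeServiceSketch {
    private final Map<String, Set<Integer>> redis = new HashMap<>();

    private static String entityKey(int entityType, int entityId) {
        return "like:entity:" + entityType + ":" + entityId;
    }

    // First call likes, second call cancels the like (as in the README).
    void toggleLike(int userId, int entityType, int entityId) {
        Set<Integer> likers = redis.computeIfAbsent(
                entityKey(entityType, entityId), k -> new HashSet<>());
        if (!likers.add(userId)) {   // already present -> unlike
            likers.remove(userId);
        }
    }

    long likeCount(int entityType, int entityId) {          // SCARD
        return redis.getOrDefault(entityKey(entityType, entityId),
                Collections.emptySet()).size();
    }

    // 1 = liked, 0 = not liked (a typical status convention)
    int likeStatus(int userId, int entityType, int entityId) {  // SISMEMBER
        return redis.getOrDefault(entityKey(entityType, entityId),
                Collections.emptySet()).contains(userId) ? 1 : 0;
    }
}
```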
### My Received Likes

### Following (Asynchronous Request)
- If A follows B, A is B's follower and B is A's followee
- The target of a follow can be a user, a post, a question, etc.; these targets are abstracted as entities in the implementation (currently only following users is implemented)

The entities a user follows are stored in a Redis zset: the key is `followee:userId:entityType`, and the value is `zset(entityId, now)`, sorted by follow time. For example, `followee:111:3` with value `(20, 2020-02-03-xxxx)` means user 111 followed the entity of type 3, i.e. a person (user), whose id is 20, at time 2020-02-03-xxxx

Likewise, the followers an entity has are stored in a Redis zset: the key is `follower:entityType:entityId`, and the value is `zset(userId, now)`, sorted by follow time
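The double bookkeeping above (one zset per followee key, one per follower key) can be sketched like this, with ordinary maps of member-to-score standing in for Redis zsets (real Redis would use ZADD/ZREM/ZCARD; names are illustrative):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the follow module: every follow writes two zset entries,
// using the key schemes followee:userId:entityType and
// follower:entityType:entityId from the text (score = follow time).
class FollowServiceSketch {
    private final Map<String, Map<Integer, Long>> redis = new HashMap<>();

    void follow(int userId, int entityType, int entityId, long now) {
        // ZADD followee:userId:entityType entityId now
        redis.computeIfAbsent("followee:" + userId + ":" + entityType,
                k -> new LinkedHashMap<>()).put(entityId, now);
        // ZADD follower:entityType:entityId userId now
        redis.computeIfAbsent("follower:" + entityType + ":" + entityId,
                k -> new LinkedHashMap<>()).put(userId, now);
    }

    void unfollow(int userId, int entityType, int entityId) {
        // ZREM on both keys
        redis.getOrDefault("followee:" + userId + ":" + entityType,
                new HashMap<>()).remove(entityId);
        redis.getOrDefault("follower:" + entityType + ":" + entityId,
                new HashMap<>()).remove(userId);
    }

    int followeeCount(int userId, int entityType) {        // ZCARD
        return redis.getOrDefault("followee:" + userId + ":" + entityType,
                Collections.emptyMap()).size();
    }

    int followerCount(int entityType, int entityId) {      // ZCARD
        return redis.getOrDefault("follower:" + entityType + ":" + entityId,
                Collections.emptyMap()).size();
    }
}
```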
### Followee List

### Sending System Notifications
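System notifications (and search indexing) are driven by events published to Kafka. A minimal sketch of that fire-and-forget flow, with a `BlockingQueue` standing in for a Kafka topic (in the real stack this would be `KafkaTemplate.send` on the producer side and `@KafkaListener` on the consumer side; field names here are illustrative):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the event-driven notification flow: after a user action the
// producer publishes an event, and a consumer turns it into a system
// notification for the affected user.
class EventSketch {
    static class Event {
        final String topic;      // e.g. "like", "comment", "follow"
        final int fromUserId;    // who triggered the event
        final int toUserId;      // who should be notified
        Event(String topic, int fromUserId, int toUserId) {
            this.topic = topic;
            this.fromUserId = fromUserId;
            this.toUserId = toUserId;
        }
    }

    static final BlockingQueue<Event> QUEUE = new LinkedBlockingQueue<>();

    static void publish(Event e) {       // producer side
        QUEUE.offer(e);
    }

    static String consumeOne() {         // consumer side
        Event e = QUEUE.poll();          // a real consumer is callback-driven
        if (e == null) return null;
        return "notify user " + e.toUserId + ": " + e.topic
                + " from user " + e.fromUserId;
    }
}
```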
### Displaying System Notifications
### Search

Similarly, pinning and highlighting a post also trigger the post event; they are not drawn in the diagram.

### Pin, Highlight, Delete (Asynchronous Requests)

### Site Statistics
### Post Hotness Calculation

Whenever a post is liked, commented on, or highlighted, it is recorded in the Redis cache; the distributed scheduled task Spring Quartz then periodically takes these posts out of the cache and recomputes their scores.

Post score/hotness formula: score (hotness) = log10(weight) + number of days between the post date and the site's epoch date

```java
// compute the weight
double w = (wonderful ? 75 : 0) + commentCount * 10 + likeCount * 2;
// score = log of the weight + days since the site epoch
double score = Math.log10(Math.max(w, 1))
        + (post.getCreateTime().getTime() - epoch.getTime()) / (1000 * 3600 * 24);
```
## 📖 Companion Tutorials

If you want to build this project from scratch yourself or understand it in depth, scan the QR code below to follow the WeChat official account "飞天小牛肉" and get the companion tutorials as soon as they are released. They explain each major technical point of the project in detail and also collect the related common interview questions; they are still being updated.

<img src="https://gitee.com/veal98/images/raw/master/img/20210204145531.png" style="zoom:67%;" />

## 📞 Contact Me

You can also add me on WeChat with any questions; remember to note your purpose in the format <u>(school or company - name or nickname - purpose)</u>

<img width="260px" src="https://gitee.com/veal98/images/raw/master/img/微信图片_20210105121328.jpg" >

## 👏 Acknowledgements

This project is based on the [Nowcoder](https://www.nowcoder.com/) Java Senior Engineer course; thanks to the instructor and the platform
docs/.vuepress/config.js (new file, 21 lines):

```js
module.exports = {
  title: '开源社区系统 — Echo', // site title
  description: '一款基于 SpringBoot + MyBatis + MySQL + Redis + Kafka + Elasticsearch + ... 实现的开源社区系统,并提供详细的开发文档和配套教程',
  themeConfig: {
    nav: [
      {
        text: '仓库地址',
        items: [
          { text: 'Github', link: 'https://github.com/Veal98/Echo' },
          { text: 'Gitee', link: 'https://gitee.com/veal98/echo' }
        ]
      },
      { text: '体验项目', link: 'http://1.15.127.74:8080/' },
      { text: '配套教程', link: '/error' }
    ],
    sidebar: 'auto', // sidebar config
    sidebarDepth: 2,
    lastUpdated: 'Last Updated' // string | boolean
  }
}
```
The commit also adds the generated VuePress build output under docs/.vuepress/dist/ (vendored new files: 404.html, the bundled CSS, the search icon SVG, and the webpack chunk files 2–7 plus app.js; their minified contents are omitted here).
The commit also removes an old doc, "开发社区首页" (Developing the Community Home Page, 411 lines):

# Developing the Community Home Page

---
## DiscussPost

### Entity

### DAO

- Mapper interface `DiscussPostMapper`
- The corresponding XML mapping file `discusspost-mapper.xml`

The `@Param` annotation gives a parameter an alias; **if a method has only one parameter** and it is used inside an `<if>` element, the alias is mandatory
```java
@Mapper
public interface DiscussPostMapper {

    /**
     * Query posts with pagination
     *
     * @param userId when userId = 0, query all users' posts;
     *               when userId != 0, query the given user's posts
     * @param offset starting index of the page
     * @param limit  number of rows per page
     * @return list of posts
     */
    List<DiscussPost> selectDiscussPosts(int userId, int offset, int limit);

    /**
     * Count posts
     * @param userId when userId = 0, count all users' posts;
     *               when userId != 0, count the given user's posts
     * @return post count
     */
    int selectDiscussPostRows(@Param("userId") int userId);
}
```
The corresponding mapper XML:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE mapper
        PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="com.greate.community.dao.DiscussPostMapper">

    <sql id="selectFields">
        id, user_id, title, content, type, status, create_time, comment_count, score
    </sql>

    <!-- Query posts with pagination -->
    <!-- Hide blocked posts; order by pinned status and creation time -->
    <select id="selectDiscussPosts" resultType="DiscussPost">
        select <include refid="selectFields"></include>
        from discuss_post
        where status != 2
        <if test="userId != 0">
            and user_id = #{userId}
        </if>
        order by type desc, create_time desc
        limit #{offset}, #{limit}
    </select>

    <!-- Count posts -->
    <select id="selectDiscussPostRows" resultType="int">
        select count(id)
        from discuss_post
        where status != 2
        <if test="userId != 0">
            and user_id = #{userId}
        </if>
    </select>

</mapper>
```

### Service

On IDEA's false-positive error when autowiring a Mapper, see [关于IDEA中@Autowired 注解报错~图文](https://www.cnblogs.com/taopanfeng/p/10994075.html)
```java
@Service
public class DiscussPostSerivce {

    @Autowired
    private DiscussPostMapper discussPostMapper;

    /**
     * Query posts with pagination
     *
     * @param userId when userId = 0, query all users' posts;
     *               when userId != 0, query the given user's posts
     * @param offset starting index of the page
     * @param limit  number of rows per page
     * @return list of posts
     */
    public List<DiscussPost> findDiscussPosts(int userId, int offset, int limit) {
        return discussPostMapper.selectDiscussPosts(userId, offset, limit);
    }

    /**
     * Count posts
     * @param userId when userId = 0, count all users' posts;
     *               when userId != 0, count the given user's posts
     * @return post count
     */
    public int findDiscussPostRows(int userId) {
        return discussPostMapper.selectDiscussPostRows(userId);
    }

}
```
## User

### Entity

### DAO
```java
@Mapper
public interface UserMapper {

    /**
     * Query a user by id
     */
    User selectById(int id);

    /**
     * Query a user by username
     */
    User selectByName(String username);

    /**
     * Query a user by email
     */
    User selectByEmail(String email);

    /**
     * Insert a user (registration)
     */
    int insertUser(User user);

    /**
     * Update a user's status
     * @param status 0: not activated, 1: activated
     */
    int updateStatus(int id, int status);

    /**
     * Update a user's avatar
     */
    int updateHeader(int id, String headerUrl);

    /**
     * Update a user's password
     */
    int updatePassword(int id, String password);

}
```
The corresponding mapper.xml:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE mapper
        PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="com.greate.community.dao.UserMapper">

    <sql id="insertFields">
        username, password, salt, email, type, status, activation_code, header_url, create_time
    </sql>

    <sql id="selectFields">
        id, username, password, salt, email, type, status, activation_code, header_url, create_time
    </sql>

    <!-- Query a user by id -->
    <select id="selectById" resultType="User">
        select <include refid="selectFields"></include>
        from user
        where id = #{id}
    </select>

    <!-- Query a user by username -->
    <select id="selectByName" resultType="User">
        select <include refid="selectFields"></include>
        from user
        where username = #{username}
    </select>

    <!-- Query a user by email -->
    <select id="selectByEmail" resultType="User">
        select <include refid="selectFields"></include>
        from user
        where email = #{email}
    </select>

    <!-- Insert a user (registration) -->
    <insert id="insertUser" parameterType="User" keyProperty="id">
        insert into user (<include refid="insertFields"></include>)
        values(#{username}, #{password}, #{salt}, #{email}, #{type}, #{status}, #{activationCode}, #{headerUrl}, #{createTime})
    </insert>

    <!-- Update a user's status -->
    <update id="updateStatus">
        update user set status = #{status} where id = #{id}
    </update>

    <!-- Update a user's avatar -->
    <update id="updateHeader">
        update user set header_url = #{headerUrl} where id = #{id}
    </update>

    <!-- Update a user's password -->
    <update id="updatePassword">
        update user set password = #{password} where id = #{id}
    </update>

</mapper>
```
### Service

```java
@Service
public class UserService {

    @Autowired
    private UserMapper userMapper;

    public User findUserById(int id) {
        return userMapper.selectById(id);
    }

}
```
## Page (Pagination)

```java
/**
 * Encapsulates pagination information
 */
public class Page {

    // current page number
    private int current = 1;
    // maximum number of posts per page
    private int limit = 10;
    // total number of posts (used to compute the total page count)
    private int rows;
    // query path (lets the pagination links be reused: other pages
    // besides the home page are paginated too)
    private String path;

    public int getCurrent() {
        return current;
    }

    public void setCurrent(int current) {
        if (current >= 1) {
            this.current = current;
        }
    }

    public int getLimit() {
        return limit;
    }

    public void setLimit(int limit) {
        if (limit >= 1 && limit <= 100) {
            this.limit = limit;
        }
    }

    public int getRows() {
        return rows;
    }

    public void setRows(int rows) {
        if (rows >= 0) {
            this.rows = rows;
        }
    }

    public String getPath() {
        return path;
    }

    public void setPath(String path) {
        this.path = path;
    }

    /**
     * Starting index (offset) of the current page
     */
    public int getOffset() {
        return current * limit - limit;
    }

    /**
     * Total number of pages
     */
    public int getTotal() {
        if (rows % limit == 0) {
            return rows / limit;
        } else {
            return rows / limit + 1;
        }
    }

    /**
     * First page number shown in the pagination bar
     * (the bar shows the current page and the two pages on either side)
     */
    public int getFrom() {
        int from = current - 2;
        return from < 1 ? 1 : from;
    }

    /**
     * Last page number shown in the pagination bar
     */
    public int getTo() {
        int to = current + 2;
        int total = getTotal();
        return to > total ? total : to;
    }
}
```

(Note: the original `setLimit` checked `current >= 1` instead of `limit >= 1`; that guard is corrected above.)
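A quick sanity check of the offset and page-count arithmetic in `getOffset()` and `getTotal()` above, restated as standalone helpers purely for illustration:

```java
// Minimal restatement of the Page arithmetic for a sanity check.
class PageMath {
    // starting index of page `current` when each page holds `limit` rows
    static int offset(int current, int limit) {
        return current * limit - limit;
    }

    // total number of pages needed for `rows` rows
    static int total(int rows, int limit) {
        return rows % limit == 0 ? rows / limit : rows / limit + 1;
    }
}
```

For example, 25 posts at 10 per page give 3 pages, and page 3 starts at offset 20.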
## Controller

```java
@Controller
public class HomeController {

    @Autowired
    private DiscussPostSerivce discussPostSerivce;

    @Autowired
    private UserService userService;

    @GetMapping("/index")
    public String getIndexPage(Model model, Page page) {
        // total row count (used to compute the total page count)
        page.setRows(discussPostSerivce.findDiscussPostRows(0));
        page.setPath("/index");

        // paged query
        List<DiscussPost> list = discussPostSerivce.findDiscussPosts(0, page.getOffset(), page.getLimit());
        // bundle each post with the user who wrote it
        List<Map<String, Object>> discussPosts = new ArrayList<>();
        if (list != null) {
            for (DiscussPost post : list) {
                Map<String, Object> map = new HashMap<>();
                map.put("post", post);
                User user = userService.findUserById(post.getUserId());
                map.put("user", user);
                discussPosts.add(map);
            }
        }
        model.addAttribute("discussPosts", discussPosts);
        return "index";
    }

}
```

🚩 Tip: there is no need to put Page into the model here (`model.addAttribute("page", page);`)

Before the handler method is called, Spring MVC automatically instantiates Model and Page and injects Page into the Model, so Thymeleaf can access the Page object's data directly
## Front-End Page index.html

```html
th:each="map:${discussPosts}"
```

This names each element taken from `discussPosts` during iteration `map`
docs/100-发布帖子.md (215 lines removed):

# Publishing a Post

---

<img src="https://gitee.com/veal98/images/raw/master/img/20210122102856.png" style="zoom: 33%;" />
## Util

Utility class: wraps a server-side response message into a JSON-formatted string
```java
/**
 * Wrap a server-side response message into a JSON-formatted string
 * @param code status code
 * @param msg  message
 * @param map  business data
 * @return JSON-formatted string
 */
public static String getJSONString(int code, String msg, Map<String, Object> map) {
    JSONObject json = new JSONObject();
    json.put("code", code);
    json.put("msg", msg);
    if (map != null) {
        for (String key : map.keySet()) {
            json.put(key, map.get(key));
        }
    }
    return json.toJSONString();
}

// Overload: the server method may return no business data
public static String getJSONString(int code, String msg) {
    return getJSONString(code, msg, null);
}

// Overload: the server method may return no business data and no message
public static String getJSONString(int code) {
    return getJSONString(code, null, null);
}

/**
 * Test
 */
public static void main(String[] args) {
    Map<String, Object> map = new HashMap<>();
    map.put("name", "Jack");
    map.put("age", 18);
    // {"msg":"ok","code":0,"name":"Jack","age":18}
    System.out.println(getJSONString(0, "ok", map));
}
```
## DAO

`DiscussPostMapper`

```java
/**
 * Insert (add) a post
 */
int insertDiscussPost(DiscussPost discussPost);
```

The corresponding `mapper.xml`:

```xml
<!-- Insert (add) a post -->
<insert id="insertDiscussPost" parameterType="DiscussPost" keyProperty="id">
    insert into discuss_post (<include refid="insertFields"></include>)
    values(#{userId}, #{title}, #{content}, #{type}, #{status}, #{createTime}, #{commentCount}, #{score})
</insert>
```
## Service

```java
/**
 * Add a post
 */
public int addDiscussPost(DiscussPost discussPost) {
    if (discussPost == null) {
        throw new IllegalArgumentException("参数不能为空");
    }

    // escape HTML markup to prevent injection via HTML tags
    discussPost.setTitle(HtmlUtils.htmlEscape(discussPost.getTitle()));
    discussPost.setContent(HtmlUtils.htmlEscape(discussPost.getContent()));

    // filter sensitive words
    discussPost.setTitle(sensitiveFilter.filter(discussPost.getTitle()));
    discussPost.setContent(sensitiveFilter.filter(discussPost.getContent()));

    return discussPostMapper.insertDiscussPost(discussPost);
}
```

HTML markup is escaped to stop attack payloads injected through HTML tags, e.g. `<script>alert('哈哈')</script>`
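The `sensitiveFilter.filter` call above is not shown in this doc. Sensitive-word filtering of this kind is commonly implemented with a prefix tree (trie); here is a minimal sketch of the technique (a stand-in, not the project's actual SensitiveFilter):

```java
import java.util.HashMap;
import java.util.Map;

// A minimal prefix-tree (trie) sensitive-word filter: matched words are
// replaced with "***".
class SensitiveFilterSketch {
    private static class Node {
        final Map<Character, Node> children = new HashMap<>();
        boolean isEnd;
    }

    private final Node root = new Node();

    void addWord(String word) {
        Node cur = root;
        for (char c : word.toCharArray()) {
            cur = cur.children.computeIfAbsent(c, k -> new Node());
        }
        cur.isEnd = true;
    }

    String filter(String text) {
        StringBuilder sb = new StringBuilder();
        int i = 0;
        while (i < text.length()) {
            // try to match a sensitive word starting at position i
            Node cur = root;
            int j = i, lastEnd = -1;
            while (j < text.length()) {
                cur = cur.children.get(text.charAt(j));
                if (cur == null) break;
                if (cur.isEnd) lastEnd = j;   // remember the longest match
                j++;
            }
            if (lastEnd >= 0) {
                sb.append("***");
                i = lastEnd + 1;              // skip past the matched word
            } else {
                sb.append(text.charAt(i));
                i++;
            }
        }
        return sb.toString();
    }
}
```

A production filter would also skip interleaved noise characters (e.g. "赌*博"), which this sketch omits.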
## Controller

```java
@Controller
@RequestMapping("/discuss")
public class DiscussPostController {

    @Autowired
    private DiscussPostSerivce discussPostSerivce;

    @Autowired
    private HostHolder hostHolder;

    /**
     * Add a post (publish)
     */
    @PostMapping("/add")
    @ResponseBody
    public String addDiscussPost(String title, String content) {
        User user = hostHolder.getUser();
        if (user == null) {
            return CommunityUtil.getJSONString(403, "您还未登录");
        }

        DiscussPost discussPost = new DiscussPost();
        discussPost.setUserId(user.getId());
        discussPost.setTitle(title);
        discussPost.setContent(content);
        discussPost.setCreateTime(new Date());

        discussPostSerivce.addDiscussPost(discussPost);

        // error cases will be handled uniformly later
        return CommunityUtil.getJSONString(0, "发布成功");
    }

}
```
|
||||
|
||||
|
||||
|
||||
## 前端

```html
<button data-toggle="modal" data-target="#publishModal"
        th:if="${loginUser != null}">我要发布</button>

<!-- 弹出框 -->
<div class="modal fade" id="publishModal">
    标题:<input id="recipient-name">
    正文:<textarea id="message-text"></textarea>
    <button id="publishBtn">发布</button>
</div>

<!-- 提示框 -->
<div class="modal fade" id="hintModal">
    ......
</div>
```

对应的 js:

```js
$(function(){
    $("#publishBtn").click(publish);
});

function publish() {
    $("#publishModal").modal("hide");
    // 获取标题和内容
    var title = $("#recipient-name").val();
    var content = $("#message-text").val();
    // 发送异步请求
    $.post(
        CONTEXT_PATH + "/discuss/add",
        {"title": title, "content": content},
        // 处理服务端返回的数据
        function (data) {
            // String -> JSON 对象
            data = $.parseJSON(data);
            // 在提示框 hintBody 显示服务端返回的消息
            $("#hintBody").text(data.msg);
            // 显示提示框
            $("#hintModal").modal("show");
            // 2s 后自动隐藏提示框
            setTimeout(function(){
                $("#hintModal").modal("hide");
                // 发布成功则刷新页面
                if (data.code == 0) {
                    window.location.reload();
                }
            }, 2000);
        }
    );
}
```

```js
var title = $("#recipient-name").val();
```

获取 id 为 recipient-name 的输入框中的值
# 帖子详情页

---

## DAO

## Service

## Controller

```java
/**
 * 进入帖子详情页
 * @param discussPostId
 * @param model
 * @return
 */
@GetMapping("/detail/{discussPostId}")
public String getDiscussPost(@PathVariable("discussPostId") int discussPostId, Model model) {
    // 帖子
    DiscussPost discussPost = discussPostSerivce.findDiscussPostById(discussPostId);
    model.addAttribute("post", discussPost);
    // 作者
    User user = userService.findUserById(discussPost.getUserId());
    model.addAttribute("user", user);

    return "/site/discuss-detail";
}
```

## 前端

```html
<span th:utext="${post.title}"></span>

<div th:utext="${user.username}"></div>

<div th:utext="${post.content}"></div>
```
# 显示评论

---

解释一下评论表中的三个字段:

- `entity_type`:评论目标的类别。既可以对帖子进行评论,也可以对帖子下的某条评论进行评论(即回复):1 表示帖子,2 表示评论
- `entity_id`:评论目标的 id。比如对 id 为 115 的帖子进行评论,或对 id 为 231 的评论进行回复
- `target_id`:指明这条回复是针对哪个用户的评论(该字段只对回复生效,默认是 0)
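这三个字段的取值方式可以用一段示意代码来说明(下面的 `Comment` 是为演示而简化的假设性版本,只保留了上文提到的三个字段):

```java
// 示意:评论与回复在 comment 表中三个字段的取值(假设性的精简 Comment 类)
public class CommentFieldDemo {

    static class Comment {
        int entityType; // 1: 评论帖子; 2: 回复某条评论
        int entityId;   // 评论目标的 id(帖子 id 或评论 id)
        int targetId;   // 回复所针对的用户 id(仅回复使用, 默认 0)

        Comment(int entityType, int entityId, int targetId) {
            this.entityType = entityType;
            this.entityId = entityId;
            this.targetId = targetId;
        }
    }

    public static void main(String[] args) {
        // 对 id 为 115 的帖子进行评论
        Comment commentOnPost = new Comment(1, 115, 0);
        // 对 id 为 231 的评论进行回复, 并指明回复的是用户 112
        Comment replyToComment = new Comment(2, 231, 112);

        System.out.println(commentOnPost.entityType);  // 1
        System.out.println(replyToComment.targetId);   // 112
    }
}
```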
## DAO

```java
@Mapper
public interface CommentMapper {

    /**
     * 根据评论目标(类别、id)对评论进行分页查询
     * @param entityType 评论目标的类别
     * @param entityId 评论目标的 id
     * @param offset 每页的起始索引
     * @param limit 每页显示多少条数据
     * @return
     */
    List<Comment> selectCommentByEntity(int entityType, int entityId, int offset, int limit);

    /**
     * 查询评论的数量
     * @param entityType
     * @param entityId
     * @return
     */
    int selectCountByEntity(int entityType, int entityId);

}
```

对应的 `mapper.xml`:

```xml
<mapper namespace="com.greate.community.dao.CommentMapper">

    <sql id="selectFields">
        id, user_id, entity_type, entity_id, target_id, content, status, create_time
    </sql>

    <sql id="insertFields">
        user_id, entity_type, entity_id, target_id, content, status, create_time
    </sql>

    <!--分页查询评论-->
    <!--不查询禁用的评论, 按照创建时间升序排序-->
    <select id="selectCommentByEntity" resultType="Comment">
        select <include refid="selectFields"></include>
        from comment
        where status = 0
        and entity_type = #{entityType}
        and entity_id = #{entityId}
        order by create_time asc
        limit #{offset}, #{limit}
    </select>

    <!--查询评论的个数-->
    <select id="selectCountByEntity" resultType="int">
        select count(id)
        from comment
        where status = 0
        and entity_type = #{entityType}
        and entity_id = #{entityId}
    </select>

</mapper>
```
## Service

```java
@Service
public class CommentService {

    @Autowired
    private CommentMapper commentMapper;

    /**
     * 根据评论目标(类别、id)对评论进行分页查询
     * @param entityType
     * @param entityId
     * @param offset
     * @param limit
     * @return
     */
    public List<Comment> findCommentByEntity(int entityType, int entityId, int offset, int limit) {
        return commentMapper.selectCommentByEntity(entityType, entityId, offset, limit);
    }

    /**
     * 查询评论的数量
     * @param entityType
     * @param entityId
     * @return
     */
    public int findCommentCount(int entityType, int entityId) {
        return commentMapper.selectCountByEntity(entityType, entityId);
    }
}
```
## Controller

在上篇帖子详情页的 Controller 方法中进行添加(复用分页组件)

```java
/**
 * 进入帖子详情页
 * @param discussPostId
 * @param model
 * @return
 */
@GetMapping("/detail/{discussPostId}")
public String getDiscussPost(@PathVariable("discussPostId") int discussPostId, Model model, Page page) {
    // 帖子
    DiscussPost discussPost = discussPostSerivce.findDiscussPostById(discussPostId);
    model.addAttribute("post", discussPost);
    // 作者
    User user = userService.findUserById(discussPost.getUserId());
    model.addAttribute("user", user);

    // 评论分页信息
    page.setLimit(5);
    page.setPath("/discuss/detail/" + discussPostId);
    page.setRows(discussPost.getCommentCount());

    // 存储帖子的评论
    List<Comment> commentList = commentService.findCommentByEntity(
            ENTITY_TYPE_POST, discussPost.getId(), page.getOffset(), page.getLimit());

    List<Map<String, Object>> commentVoList = new ArrayList<>(); // 封装对帖子的评论和评论的作者信息
    if (commentList != null) {
        for (Comment comment : commentList) {
            // 对帖子的评论
            Map<String, Object> commentVo = new HashMap<>();
            commentVo.put("comment", comment);
            commentVo.put("user", userService.findUserById(comment.getUserId()));

            // 存储评论的评论(不做分页)
            List<Comment> replyList = commentService.findCommentByEntity(
                    ENTITY_TYPE_COMMENT, comment.getId(), 0, Integer.MAX_VALUE);
            List<Map<String, Object>> replyVoList = new ArrayList<>(); // 封装对评论的评论和评论的作者信息
            if (replyList != null) {
                // 对评论的评论
                for (Comment reply : replyList) {
                    Map<String, Object> replyVo = new HashMap<>();
                    replyVo.put("reply", reply);
                    replyVo.put("user", userService.findUserById(reply.getUserId()));
                    // 回复的目标用户(targetId 为 0 表示不针对某个用户)
                    User target = reply.getTargetId() == 0 ? null : userService.findUserById(reply.getTargetId());
                    replyVo.put("target", target);

                    replyVoList.add(replyVo);
                }
            }
            commentVo.put("replys", replyVoList);

            // 对某个评论的回复数量
            int replyCount = commentService.findCommentCount(ENTITY_TYPE_COMMENT, comment.getId());
            commentVo.put("replyCount", replyCount);

            commentVoList.add(commentVo);
        }
    }

    model.addAttribute("comments", commentVoList);

    return "/site/discuss-detail";
}
```
## 前端

```html
<li th:each="cvo:${comments}">
    <div th:utext="${cvo.comment.content}"></div>

    <li th:each="rvo:${cvo.replys}">
        <span th:text="${rvo.reply.content}"></span>
```

`cvoStat` 是 Thymeleaf 的固定写法:循环变量名 + `Stat` 即为该循环的状态变量

`cvoStat.count` 表示当前是第几次循环
# 添加评论

---

事务管理(在添加评论失败的时候进行回滚):

- 声明式事务(推荐,简单)
- 编程式事务

<img src="https://gitee.com/veal98/images/raw/master/img/20210123142151.png" style="zoom: 50%;" />
## DAO

`DiscussPostMapper`:

```java
/**
 * 修改帖子的评论数量
 * @param id 帖子 id
 * @param commentCount
 * @return
 */
int updateCommentCount(int id, int commentCount);
```

对应的 xml:

```xml
<!--修改帖子的评论数量-->
<update id="updateCommentCount">
    update discuss_post
    set comment_count = #{commentCount}
    where id = #{id}
</update>
```

`CommentMapper`:

```java
/**
 * 添加评论
 * @param comment
 * @return
 */
int insertComment(Comment comment);
```

对应的 xml:

```xml
<!--添加评论-->
<insert id="insertComment" parameterType="Comment">
    insert into comment(<include refid="insertFields"></include>)
    values(#{userId}, #{entityType}, #{entityId}, #{targetId}, #{content}, #{status}, #{createTime})
</insert>
```
## Service

`DiscussPostService`:

```java
/**
 * 修改帖子的评论数量
 * @param id 帖子 id
 * @param commentCount
 * @return
 */
public int updateCommentCount(int id, int commentCount) {
    return discussPostMapper.updateCommentCount(id, commentCount);
}
```

`CommentService`:

```java
/**
 * 添加评论(需要事务管理)
 * @param comment
 * @return
 */
@Transactional(isolation = Isolation.READ_COMMITTED, propagation = Propagation.REQUIRED)
public int addComment(Comment comment) {
    if (comment == null) {
        throw new IllegalArgumentException("参数不能为空");
    }

    // HTML 标签转义
    comment.setContent(HtmlUtils.htmlEscape(comment.getContent()));
    // 敏感词过滤
    comment.setContent(sensitiveFilter.filter(comment.getContent()));

    // 添加评论
    int rows = commentMapper.insertComment(comment);

    // 更新帖子的评论数量
    if (comment.getEntityType() == ENTITY_TYPE_POST) {
        int count = commentMapper.selectCountByEntity(comment.getEntityType(), comment.getEntityId());
        discussPostSerivce.updateCommentCount(comment.getEntityId(), count);
    }

    return rows;
}
```
## Controller

**前端表单中的 `name="xxx"` 要和 `Comment` 实体类中的字段名一一对应**,这样 Spring MVC 才能把前端传来的值直接封装进 `addComment(Comment comment)` 的参数对象中。

```java
@Controller
@RequestMapping("/comment")
public class CommentController {

    @Autowired
    private HostHolder hostHolder;

    @Autowired
    private CommentService commentService;

    /**
     * 添加评论
     * @param discussPostId
     * @param comment
     * @return
     */
    @PostMapping("/add/{discussPostId}")
    public String addComment(@PathVariable("discussPostId") int discussPostId, Comment comment) {
        comment.setUserId(hostHolder.getUser().getId());
        comment.setStatus(0);
        comment.setCreateTime(new Date());
        commentService.addComment(comment);

        return "redirect:/discuss/detail/" + discussPostId;
    }

}
```

## 前端

```html
<form method="post" th:action="@{|/comment/add/${post.id}|}">
    <input type="text" class="input-size" name="content" th:placeholder="|回复${rvo.user.username}|"/>
    <input type="hidden" name="entityType" value="2">
    <input type="hidden" name="entityId" th:value="${cvo.comment.id}">
    <input type="hidden" name="targetId" th:value="${rvo.user.id}">
</form>
```
# 私信列表

---

本节实现的功能:

- 私信列表:
  - 查询当前用户的会话列表
  - 每个会话只显示一条最新的私信
  - 支持分页显示
- 私信详情:
  - 查询某个会话所包含的私信
  - 支持分页显示
  - 访问私信详情时,将显示的私信设为已读状态

`message` 表中有个字段 `conversation_id`,其设计方式是:比如用户 112 给 113 发消息,或者 113 给 112 发消息,这两条私信的 `conversation_id` 都是 `112_113`。当然,这个字段是冗余的,可以通过 from_id 和 to_id 推导出来,但有了它会便于后面的查询等操作

注意:`from_id = 1` 代表这是一个系统通知,后续会开发此功能
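`conversation_id` 的拼接规则(小的用户 id 在前、大的在后)可以用一个简单的示意方法表达(假设性示例,与后文「发送私信」Controller 中的拼接逻辑一致):

```java
// conversation_id 拼接示意:无论谁发给谁, 同一对用户总是得到同一个会话 id
public class ConversationIdDemo {

    public static String conversationId(int fromId, int toId) {
        // 小的用户 id 在前, 大的在后, 保证 112 -> 113 和 113 -> 112 指向同一会话
        return Math.min(fromId, toId) + "_" + Math.max(fromId, toId);
    }

    public static void main(String[] args) {
        System.out.println(conversationId(112, 113)); // 112_113
        System.out.println(conversationId(113, 112)); // 112_113
    }
}
```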
## DAO

```java
@Mapper
public interface MessageMapper {

    /**
     * 查询当前用户的会话列表,针对每个会话只返回一条最新的私信
     * @param userId 用户 id
     * @param offset 每页的起始索引
     * @param limit 每页显示多少条数据
     * @return
     */
    List<Message> selectConversations(int userId, int offset, int limit);

    /**
     * 查询当前用户的会话数量
     * @param userId
     * @return
     */
    int selectConversationCount(int userId);

    /**
     * 查询某个会话所包含的私信列表
     * @param conversationId
     * @param offset
     * @param limit
     * @return
     */
    List<Message> selectLetters(String conversationId, int offset, int limit);

    /**
     * 查询某个会话所包含的私信数量
     * @param conversationId
     * @return
     */
    int selectLetterCount(String conversationId);

    /**
     * 查询未读私信的数量
     * @param userId
     * @param conversationId conversationId = null, 则查询该用户所有会话的未读私信数量
     *                       conversationId != null, 则查询该用户某个会话的未读私信数量
     * @return
     */
    int selectLetterUnreadCount(int userId, String conversationId);

    /**
     * 修改消息的状态
     * @param ids
     * @param status
     * @return
     */
    int updateStatus(List<Integer> ids, int status);

}
```

对应的 `mapper.xml`:

```xml
<mapper namespace="com.greate.community.dao.MessageMapper">

    <sql id="selectFields">
        id, from_id, to_id, conversation_id, content, status, create_time
    </sql>

    <!--查询当前用户的会话列表(针对每个会话只返回一条最新的私信)-->
    <select id="selectConversations" resultType="Message">
        select <include refid="selectFields"></include>
        from message
        where id in (
            select max(id)
            from message
            where status != 2
            and from_id != 1
            and (from_id = #{userId} or to_id = #{userId})
            group by conversation_id
        )
        order by id desc
        limit #{offset}, #{limit}
    </select>

    <!--查询当前用户的会话数量-->
    <select id="selectConversationCount" resultType="int">
        select count(m.maxid) from (
            select max(id) as maxid
            from message
            where status != 2
            and from_id != 1
            and (from_id = #{userId} or to_id = #{userId})
            group by conversation_id
        ) as m
    </select>

    <!--查询某个会话所包含的私信列表-->
    <select id="selectLetters" resultType="Message">
        select <include refid="selectFields"></include>
        from message
        where status != 2
        and from_id != 1
        and conversation_id = #{conversationId}
        order by id desc
        limit #{offset}, #{limit}
    </select>

    <!--查询某个会话所包含的私信数量-->
    <select id="selectLetterCount" resultType="int">
        select count(id)
        from message
        where status != 2
        and from_id != 1
        and conversation_id = #{conversationId}
    </select>

    <!--查询未读私信的数量-->
    <select id="selectLetterUnreadCount" resultType="int">
        select count(id)
        from message
        where status = 0
        and from_id != 1
        and to_id = #{userId}
        <if test="conversationId != null">
            and conversation_id = #{conversationId}
        </if>
    </select>

    <!--修改消息的状态-->
    <update id="updateStatus">
        update message
        set status = #{status}
        where id in
        <foreach collection="ids" item="id" open="(" separator="," close=")">
            #{id}
        </foreach>
    </update>

</mapper>
```
## Service

```java
@Service
public class MessageService {

    @Autowired
    private MessageMapper messageMapper;

    // 查询当前用户的会话列表,针对每个会话只返回一条最新的私信
    public List<Message> findConversations(int userId, int offset, int limit) {
        return messageMapper.selectConversations(userId, offset, limit);
    }

    // 查询当前用户的会话数量
    public int findConversationCout(int userId) {
        return messageMapper.selectConversationCount(userId);
    }

    // 查询某个会话所包含的私信列表
    public List<Message> findLetters(String conversationId, int offset, int limit) {
        return messageMapper.selectLetters(conversationId, offset, limit);
    }

    // 查询某个会话所包含的私信数量
    public int findLetterCount(String conversationId) {
        return messageMapper.selectLetterCount(conversationId);
    }

    // 查询未读私信的数量
    public int findLetterUnreadCount(int userId, String conversationId) {
        return messageMapper.selectLetterUnreadCount(userId, conversationId);
    }

    // 读取私信(将私信状态设置为已读)
    public int readMessage(List<Integer> ids) {
        return messageMapper.updateStatus(ids, 1);
    }

}
```
## Controller

```java
@Controller
public class MessageController {

    @Autowired
    private HostHolder hostHolder;

    @Autowired
    private MessageService messageService;

    @Autowired
    private UserService userService;

    /**
     * 私信列表
     * @param model
     * @param page
     * @return
     */
    @GetMapping("/letter/list")
    public String getLetterList(Model model, Page page) {
        User user = hostHolder.getUser();
        // 分页信息
        page.setLimit(5);
        page.setPath("/letter/list");
        page.setRows(messageService.findConversationCout(user.getId()));
        // 会话列表
        List<Message> conversationList = messageService.findConversations(
                user.getId(), page.getOffset(), page.getLimit());

        List<Map<String, Object>> conversations = new ArrayList<>();
        if (conversationList != null) {
            for (Message message : conversationList) {
                Map<String, Object> map = new HashMap<>();
                map.put("conversation", message); // 私信
                map.put("letterCount", messageService.findLetterCount(
                        message.getConversationId())); // 私信数量
                map.put("unreadCount", messageService.findLetterUnreadCount(
                        user.getId(), message.getConversationId())); // 未读私信数量
                int targetId = user.getId() == message.getFromId() ? message.getToId() : message.getFromId();
                map.put("target", userService.findUserById(targetId)); // 私信对方

                conversations.add(map);
            }
        }
        model.addAttribute("conversations", conversations);

        // 查询当前用户的所有未读私信数量
        int letterUnreadCount = messageService.findLetterUnreadCount(user.getId(), null);
        model.addAttribute("letterUnreadCount", letterUnreadCount);

        return "/site/letter";
    }

    /**
     * 私信详情页
     * @param conversationId
     * @param page
     * @param model
     * @return
     */
    @GetMapping("/letter/detail/{conversationId}")
    public String getLetterDetail(@PathVariable("conversationId") String conversationId, Page page, Model model) {
        // 分页信息
        page.setLimit(5);
        page.setPath("/letter/detail/" + conversationId);
        page.setRows(messageService.findLetterCount(conversationId));

        // 私信列表
        List<Message> letterList = messageService.findLetters(conversationId, page.getOffset(), page.getLimit());

        List<Map<String, Object>> letters = new ArrayList<>();
        if (letterList != null) {
            for (Message message : letterList) {
                Map<String, Object> map = new HashMap<>();
                map.put("letter", message);
                map.put("fromUser", userService.findUserById(message.getFromId()));
                letters.add(map);
            }
        }
        model.addAttribute("letters", letters);

        // 私信目标
        model.addAttribute("target", getLetterTarget(conversationId));

        // 将私信列表中的未读消息改为已读
        List<Integer> ids = getUnreadLetterIds(letterList);
        if (!ids.isEmpty()) {
            messageService.readMessage(ids);
        }

        return "/site/letter-detail";
    }

    /**
     * 获取私信对方对象
     * @param conversationId
     * @return
     */
    private User getLetterTarget(String conversationId) {
        String[] ids = conversationId.split("_");
        int id0 = Integer.parseInt(ids[0]);
        int id1 = Integer.parseInt(ids[1]);

        if (hostHolder.getUser().getId() == id0) {
            return userService.findUserById(id1);
        }
        else {
            return userService.findUserById(id0);
        }
    }

    /**
     * 获取当前登录用户未读私信的 id
     * @param letterList
     * @return
     */
    private List<Integer> getUnreadLetterIds(List<Message> letterList) {
        List<Integer> ids = new ArrayList<>();

        if (letterList != null) {
            for (Message message : letterList) {
                // 当前用户是私信的接收者且该私信处于未读状态
                if (hostHolder.getUser().getId() == message.getToId() && message.getStatus() == 0) {
                    ids.add(message.getId());
                }
            }
        }

        return ids;
    }

}
```
## 前端

`letter-detail`:

```html
<li th:each="map:${letters}">
    <img th:src="${map.fromUser.headerUrl}" alt="用户头像" >
    <strong th:utext="${map.fromUser.username}"></strong>
    <div th:utext="${map.letter.content}"></div>
</li>
```
# 发送私信

---

## DAO

```java
/**
 * 新增一条私信
 * @param message
 * @return
 */
int insertMessage(Message message);
```

对应的 `mapper.xml`:

```xml
<!--新增一条私信-->
<insert id="insertMessage" parameterType="Message" keyProperty="id">
    insert into message(<include refid="insertFields"></include>)
    values(#{fromId}, #{toId}, #{conversationId}, #{content}, #{status}, #{createTime})
</insert>
```
## Service

```java
// 添加一条私信
public int addMessage(Message message) {
    // 转义 HTML 标签
    message.setContent(HtmlUtils.htmlEscape(message.getContent()));
    // 过滤敏感词
    message.setContent(sensitiveFilter.filter(message.getContent()));

    return messageMapper.insertMessage(message);
}
```
## Controller

```java
/**
 * 发送私信
 * @param toName 收信人 username
 * @param content 内容
 * @return
 */
@PostMapping("/letter/send")
@ResponseBody
public String sendLetter(String toName, String content) {
    User target = userService.findUserByName(toName);
    if (target == null) {
        return CommunityUtil.getJSONString(1, "目标用户不存在");
    }

    Message message = new Message();
    message.setFromId(hostHolder.getUser().getId());
    message.setToId(target.getId());
    if (message.getFromId() < message.getToId()) {
        message.setConversationId(message.getFromId() + "_" + message.getToId());
    }
    else {
        message.setConversationId(message.getToId() + "_" + message.getFromId());
    }
    message.setContent(content);
    message.setStatus(0); // 默认就是 0 未读,可不写
    message.setCreateTime(new Date());

    messageService.addMessage(message);

    return CommunityUtil.getJSONString(0);
}
```
## 前端

```html
<h5 id="exampleModalLabel">发私信</h5>
<button type="button" class="close" data-dismiss="modal" aria-label="Close"></button>

<div class="modal-body">
    <form>
        <input id="recipient-name">
        <textarea id="message-text"></textarea>
    </form>
</div>

<div class="modal-footer">
    <button id="sendBtn">发送</button>
</div>
```

对应的 `letter.js` 文件:

```js
$(function(){
    $("#sendBtn").click(send_letter);
    $(".close").click(delete_msg);
});

function send_letter() {
    $("#sendModal").modal("hide");

    var toName = $("#recipient-name").val();
    var content = $("#message-text").val();
    $.post(
        CONTEXT_PATH + "/letter/send",
        {"toName":toName, "content":content},
        function(data) {
            data = $.parseJSON(data);
            if (data.code == 0) {
                $("#hintBody").text("发送成功");
            }
            else {
                $("#hintBody").text(data.msg);
            }
            /* 显示提示框, 2s 后隐藏并刷新界面 */
            $("#hintModal").modal("show");
            setTimeout(function(){
                $("#hintModal").modal("hide");
                location.reload();
            }, 2000);
        }
    );
}

function delete_msg() {
    // TODO 删除数据
    $(this).parents(".media").remove();
}
```
# 统一处理异常

---

首先,SpringBoot 自动对异常做了一个**表面处理**,我们只要遵守它的约定:

- 以错误码命名页面文件(如 404.html、500.html),放在 `/templates/error` 文件夹下即可

<img src="https://gitee.com/veal98/images/raw/master/img/20210125103620.png" style="zoom: 67%;" />

当然,这种处理仅仅是页面的跳转,对用户来说相对友好,但对开发者排查问题帮助不大。500 错误源于服务端异常,我们需要对出错的具体原因进行统一的日志记录

```java
/**
 * 处理服务端异常(500)
 */
@ControllerAdvice(annotations = Controller.class) // 扫描带有 @Controller 注解的组件
public class ExceptionAdvice {

    private static final Logger logger = LoggerFactory.getLogger(ExceptionAdvice.class);

    @ExceptionHandler({Exception.class})
    public void handleException(Exception e, HttpServletRequest request, HttpServletResponse response) throws IOException {
        logger.error("服务器发生异常:" + e.getMessage());
        for (StackTraceElement element : e.getStackTrace()) {
            logger.error(element.toString());
        }
        // 区分异步请求和普通请求
        String xRequestedWith = request.getHeader("x-requested-with");
        if ("XMLHttpRequest".equals(xRequestedWith)) {
            // 异步请求(希望返回的是 JSON 数据)
            response.setContentType("application/plain;charset=utf-8");
            PrintWriter writer = response.getWriter();
            writer.write(CommunityUtil.getJSONString(1, "服务器异常"));
        }
        else {
            // 普通请求(希望返回的是一个网页)
            response.sendRedirect(request.getContextPath() + "/error");
        }
    }

}
```
# 统一记录日志

---

利用 AOP 统一记录 Service 层方法的调用日志

```java
@Component
@Aspect
public class ServiceLogAspect {

    private static final Logger logger = LoggerFactory.getLogger(ServiceLogAspect.class);

    // 切点:service 包下所有类的所有方法
    @Pointcut("execution(* com.greate.community.service.*.*(..))")
    public void pointcut() {

    }

    @Before("pointcut()")
    public void before(JoinPoint joinPoint) {
        // 用户[IP 地址], 在某个时间访问了 [com.greate.community.service.xxx]
        ServletRequestAttributes attributes = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
        HttpServletRequest request = attributes.getRequest();
        String ip = request.getRemoteHost();
        String time = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date());
        String target = joinPoint.getSignature().getDeclaringTypeName() + "." + joinPoint.getSignature().getName();
        logger.info(String.format("用户[%s], 在[%s], 访问了[%s].", ip, time, target));
    }

}
```
# 点赞

---

点赞功能:

- 支持对帖子、评论/回复点赞
- 第 1 次点赞,第 2 次取消点赞
- 首页统计帖子的点赞数量
- 详情页统计帖子和评论/回复的点赞数量
- 详情页显示用户的点赞状态(赞过了则显示已赞)

> 与 MySQL 不同,访问 Redis 一般不单独写 DAO 层
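"第 1 次点赞、第 2 次取消点赞"本质上是对一个集合做"有则删、无则加"的切换操作,可以先用内存中的 `Set` 做一个示意(假设性示例,实际实现基于 Redis 的 set 结构,见下文 `LikeService`):

```java
import java.util.HashSet;
import java.util.Set;

// 点赞切换逻辑示意:用内存 Set 模拟 Redis 的 set 结构
public class LikeToggleDemo {

    private final Set<Integer> likedUserIds = new HashSet<>();

    // 第 1 次调用是点赞, 第 2 次调用是取消赞
    public void like(int userId) {
        if (!likedUserIds.add(userId)) {
            // add 返回 false 说明该用户已点过赞, 则移除(取消赞)
            likedUserIds.remove(userId);
        }
    }

    public long likeCount() {
        return likedUserIds.size();
    }

    public static void main(String[] args) {
        LikeToggleDemo demo = new LikeToggleDemo();
        demo.like(11);
        System.out.println(demo.likeCount()); // 1
        demo.like(11); // 再点一次即取消
        System.out.println(demo.likeCount()); // 0
    }
}
```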
## Redis 配置

导包、配置端口等:

```properties
# Redis
spring.redis.database = 11
spring.redis.host = localhost
spring.redis.port = 6379
```

Redis 配置类:

```java
/**
 * Redis 配置类
 */
@Configuration
public class RedisConfig {

    @Bean
    public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory factory) {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(factory);

        // 设置 key 的序列化方式
        template.setKeySerializer(RedisSerializer.string());
        // 设置 value 的序列化方式
        template.setValueSerializer(RedisSerializer.json());
        // 设置 hash 的 key 的序列化方式
        template.setHashKeySerializer(RedisSerializer.string());
        // 设置 hash 的 value 的序列化方式
        template.setHashValueSerializer(RedisSerializer.json());

        template.afterPropertiesSet();

        return template;
    }

}
```

动态生成 Redis 的 key:

我们将点赞相关信息存入 set 中。其中,key 命名为 `like:entity:entityType:entityId`,value 存储点赞用户的 id。比如 key = `like:entity:2:246`、value = `11`,表示用户 11 对类型为 2 的实体(即评论)点了赞,该评论的 id 是 246

```java
/**
 * 生成 Redis 的 key
 */
public class RedisKeyUtil {

    private static final String SPLIT = ":";
    private static final String PREFIX_ENTITY_LIKE = "like:entity";

    // 某个实体(帖子、评论/回复)的赞
    // like:entity:entityType:entityId -> set(userId)
    // 谁给这个实体点了赞,就将这个用户的 id 存到这个实体对应的集合里
    public static String getEntityLikeKey(int entityType, int entityId) {
        return PREFIX_ENTITY_LIKE + SPLIT + entityType + SPLIT + entityId;
    }

}
```
## Service
|
||||
|
||||
```java
|
||||
/**
|
||||
* 点赞相关
|
||||
*/
|
||||
@Service
|
||||
public class LikeService {
|
||||
|
||||
@Autowired
|
||||
private RedisTemplate redisTemplate;
|
||||
|
||||
/**
|
||||
* 点赞
|
||||
* @param userId
|
||||
* @param entityType
|
||||
* @param entityId
|
||||
*/
|
||||
public void like(int userId, int entityType, int entityId) {
|
||||
String entityLikeKey = RedisKeyUtil.getEntityLikeKey(entityType, entityId);
|
||||
// 判断用户是否已经点过赞了
|
||||
boolean isMember = redisTemplate.opsForSet().isMember(entityLikeKey, userId);
|
||||
if (isMember) {
|
||||
// 如果用户已经点过赞,点第二次则取消赞
|
||||
redisTemplate.opsForSet().remove(entityLikeKey, userId);
|
||||
}
|
||||
else {
|
||||
redisTemplate.opsForSet().add(entityLikeKey, userId);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* 查询某实体被点赞的数量
|
||||
* @param entityType
|
||||
* @param entityId
|
||||
* @return
|
||||
*/
|
||||
public long findEntityLikeCount(int entityType, int entityId) {
|
||||
String entityLikeKey = RedisKeyUtil.getEntityLikeKey(entityType, entityId);
|
||||
return redisTemplate.opsForSet().size(entityLikeKey);
|
||||
}
|
||||
|
||||
/**
|
||||
* 查询某个用户对某个实体的点赞状态(是否已赞)
|
||||
* @param userId
|
||||
* @param entityType
|
||||
* @param entityId
|
||||
* @return 1:已赞,0:未赞
|
||||
*/
|
||||
public int findEntityLikeStatus(int userId, int entityType, int entityId) {
|
||||
String entityLikeKey = RedisKeyUtil.getEntityLikeKey(entityType, entityId);
|
||||
return redisTemplate.opsForSet().isMember(entityLikeKey, userId) ? 1 : 0;
|
||||
}
|
||||
|
||||
}
|
||||
```
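The toggle semantics of `like()` — the first call likes, the second call cancels — can be illustrated without a Redis server by letting a `java.util.HashSet` stand in for the Redis set (a sketch of the logic only, not the real service above):

```java
import java.util.HashSet;
import java.util.Set;

public class LikeToggleDemo {
    private final Set<Integer> likeSet = new HashSet<>(); // stands in for the Redis set

    // Mirrors LikeService.like(): first call likes, second call un-likes
    public void like(int userId) {
        if (!likeSet.add(userId)) { // add() returns false if already present
            likeSet.remove(userId);
        }
    }

    public long likeCount() { return likeSet.size(); }

    public int likeStatus(int userId) { return likeSet.contains(userId) ? 1 : 0; }

    public static void main(String[] args) {
        LikeToggleDemo demo = new LikeToggleDemo();
        demo.like(11);
        System.out.println(demo.likeCount() + " " + demo.likeStatus(11)); // 1 1
        demo.like(11); // liking again cancels the like
        System.out.println(demo.likeCount() + " " + demo.likeStatus(11)); // 0 0
    }
}
```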

## Presentation Layer (Controller and Frontend)

When building the home page, also query each post's like count.

`HomeController`

```java
long likeCount = likeService.findEntityLikeCount(ENTITY_TYPE_POST, post.getId());
map.put("likeCount", likeCount);
```

On the post detail page, add the like action and query the like count of the post and its comments, as well as the currently logged-in user's like status.

`LikeController`

```java
/**
 * Likes
 */
@Controller
public class LikeController {

    @Autowired
    private HostHolder hostHolder;

    @Autowired
    private LikeService likeService;

    @PostMapping("/like")
    @ResponseBody
    public String like(int entityType, int entityId) {
        User user = hostHolder.getUser();
        // Like
        likeService.like(user.getId(), entityType, entityId);
        // Like count
        long likeCount = likeService.findEntityLikeCount(entityType, entityId);
        // Like status
        int likeStatus = likeService.findEntityLikeStatus(user.getId(), entityType, entityId);

        Map<String, Object> map = new HashMap<>();
        map.put("likeCount", likeCount);
        map.put("likeStatus", likeStatus);

        return CommunityUtil.getJSONString(0, null, map);
    }

}
```

The corresponding page `discuss-detail.html`:

```html
<a href="javascript:;" th:onclick="|like(this, 1, ${post.id});|" class="text-primary">
    <b th:text="${likeStatus == 1 ? '已赞' : '赞'}"></b> <i th:text="${likeCount}"></i>
</a>
```

The corresponding `discuss.js`:

```js
function like(btn, entityType, entityId) {
    $.post(
        CONTEXT_PATH + "/like",
        {"entityType":entityType, "entityId":entityId},
        function(data) {
            data = $.parseJSON(data);
            if (data.code == 0) {
                $(btn).children("i").text(data.likeCount);
                $(btn).children("b").text(data.likeStatus == 1 ? '已赞' : '赞');
            }
            else {
                alert(data.msg);
            }
        }
    );
}
```

The corresponding `DiscussPostController`:

```java
// Like count
long likeCount = likeService.findEntityLikeCount(ENTITY_TYPE_POST, discussPostId);
model.addAttribute("likeCount", likeCount);
// Like status of the currently logged-in user
int likeStatus = hostHolder.getUser() == null ? 0 :
        likeService.findEntityLikeStatus(hostHolder.getUser().getId(), ENTITY_TYPE_POST, discussPostId);
model.addAttribute("likeStatus", likeStatus);
```
# Likes I Received

---

- Refactor the like feature (wrap it in a Redis transaction)
- Count the likes a user has received, keyed by user id
  - increment(key) / decrement(key)

> Note: do not issue read commands inside a Redis transaction. Redis queues the commands of a transaction and only runs them all at once when the transaction is committed, so a read placed inside the transaction cannot return a usable result.

- Develop the profile page

## Utility Class

Add a key for the user being liked to the Redis key utility class:
```java
/**
 * Generates Redis keys
 */
public class RedisKeyUtil {

    private static final String SPLIT = ":";
    private static final String PREFIX_ENTITY_LIKE = "like:entity";
    private static final String PREFIX_USER_LIKE = "like:user";

    // Likes received by an entity (post, comment/reply)
    // like:entity:entityType:entityId -> set(userId)
    // Whoever likes the entity has their user id added to the entity's set
    public static String getEntityLikeKey(int entityType, int entityId) {
        return PREFIX_ENTITY_LIKE + SPLIT + entityType + SPLIT + entityId;
    }

    // Likes received by a user
    // like:user:userId -> int
    public static String getUserLikeKey(int userId) {
        return PREFIX_USER_LIKE + SPLIT + userId;
    }

}
```

## Service

Refactor the like method so that it also records the author of the liked post/comment:

```java
/**
 * Like an entity
 * @param userId id of the user who likes
 * @param entityType
 * @param entityId
 * @param entityUserId id of the author of the liked post/comment
 */
public void like(int userId, int entityType, int entityId, int entityUserId) {
    redisTemplate.execute(new SessionCallback() {
        @Override
        public Object execute(RedisOperations redisOperations) throws DataAccessException {
            String entityLikeKey = RedisKeyUtil.getEntityLikeKey(entityType, entityId);
            String userLikeKey = RedisKeyUtil.getUserLikeKey(entityUserId);

            // Check whether the user has already liked this entity (query before the transaction starts)
            boolean isMember = redisOperations.opsForSet().isMember(entityLikeKey, userId);

            redisOperations.multi(); // start the transaction

            if (isMember) {
                // A second like cancels the first one
                redisOperations.opsForSet().remove(entityLikeKey, userId);
                redisOperations.opsForValue().decrement(userLikeKey);
            }
            else {
                // Note: use redisOperations inside the callback, not redisTemplate
                redisOperations.opsForSet().add(entityLikeKey, userId);
                redisOperations.opsForValue().increment(userLikeKey);
            }

            return redisOperations.exec(); // commit the transaction
        }
    });
}
```
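The pairing of the entity's like set with the author's counter can likewise be sketched in memory — a `HashSet` stands in for the Redis set and a `HashMap` for the `like:user` counter (a hypothetical demo class, not the project's code):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class UserLikeCountDemo {
    private final Set<Integer> entityLikes = new HashSet<>();            // like:entity:... set
    private final Map<Integer, Integer> userLikeCount = new HashMap<>(); // like:user:... counter

    // Mirrors the transactional like(): set membership and the author's counter move together
    public void like(int userId, int entityUserId) {
        if (entityLikes.contains(userId)) {
            entityLikes.remove(userId);
            userLikeCount.merge(entityUserId, -1, Integer::sum);
        } else {
            entityLikes.add(userId);
            userLikeCount.merge(entityUserId, 1, Integer::sum);
        }
    }

    public int findUserLikeCount(int userId) {
        return userLikeCount.getOrDefault(userId, 0);
    }

    public static void main(String[] args) {
        UserLikeCountDemo demo = new UserLikeCountDemo();
        demo.like(11, 42); // user 11 likes a post authored by user 42
        demo.like(12, 42);
        demo.like(11, 42); // user 11 cancels the like
        System.out.println(demo.findUserLikeCount(42)); // 1
    }
}
```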

```java
/**
 * Count the likes a user has received
 * @param userId
 * @return
 */
public int findUserLikeCount(int userId) {
    String userLikeKey = RedisKeyUtil.getUserLikeKey(userId);
    Integer count = (Integer) redisTemplate.opsForValue().get(userLikeKey);
    return count == null ? 0 : count;
}
```

## Controller

Just add the id of the author of the liked post/comment:

```java
/**
 * Like
 * @param entityType
 * @param entityId
 * @param entityUserId id of the author of the liked post/comment
 * @return
 */
@PostMapping("/like")
@ResponseBody
public String like(int entityType, int entityId, int entityUserId) {
    User user = hostHolder.getUser();
    // Like
    likeService.like(user.getId(), entityType, entityId, entityUserId);
    // Like count
    long likeCount = likeService.findEntityLikeCount(entityType, entityId);
    // Like status
    int likeStatus = likeService.findEntityLikeStatus(user.getId(), entityType, entityId);

    Map<String, Object> map = new HashMap<>();
    map.put("likeCount", likeCount);
    map.put("likeStatus", likeStatus);

    return CommunityUtil.getJSONString(0, null, map);
}
```

## Frontend

Again, just pass the id of the author of the liked post/comment:

```html
<a href="javascript:;" th:onclick="|like(this, 1, ${post.id}, ${post.userId});|" class="text-primary">
    <b th:text="${likeStatus == 1 ? '已赞' : '赞'}"></b> <i th:text="${likeCount}"></i>
</a>
```

Update the corresponding `discuss.js`:

```js
function like(btn, entityType, entityId, entityUserId) {
    $.post(
        CONTEXT_PATH + "/like",
        {"entityType":entityType, "entityId":entityId, "entityUserId":entityUserId},
        function(data) {
            data = $.parseJSON(data);
            if (data.code == 0) {
                $(btn).children("i").text(data.likeCount);
                $(btn).children("b").text(data.likeStatus == 1 ? '已赞' : '赞');
            }
            else {
                alert(data.msg);
            }
        }
    );
}
```

Update the link on the user avatar in the frontend:

```html
<a th:href="@{|/user/profile/${map.user.id}|}">
    <img th:src="${map.user.headerUrl}" class="mr-4 rounded-circle" alt="用户头像" style="width:50px;height:50px;">
</a>
```
# Sending Email

---

> This feature is used by the registration module.

Mailbox setup:

- Enable the client SMTP service

Spring Email:

- Import the jar package
- Configure the mailbox parameters

```properties
# Spring Mail
spring.mail.host = smtp.sina.com
spring.mail.port = 465
# Mailbox username
spring.mail.username = xxx
# Authorization code (not the password)
spring.mail.password = xxx
spring.mail.protocol = smtps
spring.mail.properties.mail.smtp.ssl.enable = true
```

- Use JavaMailSender to send plain-text email

Template engine:

- Use Thymeleaf to send HTML email

# Follow

---

Requirements:

- Implement follow and unfollow
- Count a user's followees and followers

Key points:

- If A follows B, then A is B's follower (fan) and B is A's followee (target)
- The follow target can be a user, a post, a topic, etc., so targets are abstracted as entities

## Utility Class: Generating Redis Keys
```java
/**
 * Generates Redis keys
 */
public class RedisKeyUtil {

    private static final String SPLIT = ":";

    private static final String PREFIX_FOLLOWER = "follower"; // followers (fans)
    private static final String PREFIX_FOLLOWEE = "followee"; // follow targets

    /**
     * Entities a user follows
     * followee:userId:entityType -> zset(entityId, now), sorted by follow time
     * @param userId id of the follower
     * @param entityType type of the followed entity
     * @return
     */
    public static String getFolloweeKey(int userId, int entityType) {
        return PREFIX_FOLLOWEE + SPLIT + userId + SPLIT + entityType;
    }

    /**
     * Followers of an entity
     * follower:entityType:entityId -> zset(userId, now)
     * @param entityType
     * @param entityId
     * @return
     */
    public static String getFollowerKey(int entityType, int entityId) {
        return PREFIX_FOLLOWER + SPLIT + entityType + SPLIT + entityId;
    }

}
```
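The two key shapes can again be checked in isolation (a minimal sketch; `FollowKeyDemo` is a hypothetical stand-in that copies the concatenation from `RedisKeyUtil` above, and entityType 3 for users follows the value used in `profile.js` later in this doc):

```java
public class FollowKeyDemo {
    private static final String SPLIT = ":";
    private static final String PREFIX_FOLLOWER = "follower";
    private static final String PREFIX_FOLLOWEE = "followee";

    // Same logic as RedisKeyUtil.getFolloweeKey above
    public static String getFolloweeKey(int userId, int entityType) {
        return PREFIX_FOLLOWEE + SPLIT + userId + SPLIT + entityType;
    }

    // Same logic as RedisKeyUtil.getFollowerKey above
    public static String getFollowerKey(int entityType, int entityId) {
        return PREFIX_FOLLOWER + SPLIT + entityType + SPLIT + entityId;
    }

    public static void main(String[] args) {
        // User 11 follows user 22
        System.out.println(getFolloweeKey(11, 3)); // followee:11:3
        System.out.println(getFollowerKey(3, 22)); // follower:3:22
    }
}
```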

## Service

```java
/**
 * Follow-related logic
 */
@Service
public class FollowService {

    @Autowired
    private RedisTemplate redisTemplate;

    /**
     * Follow
     * @param userId
     * @param entityType
     * @param entityId
     */
    public void follow(int userId, int entityType, int entityId) {
        redisTemplate.execute(new SessionCallback() {
            @Override
            public Object execute(RedisOperations redisOperations) throws DataAccessException {
                // Build the Redis keys
                String followeeKey = RedisKeyUtil.getFolloweeKey(userId, entityType);
                String followerKey = RedisKeyUtil.getFollowerKey(entityType, entityId);

                // Start the transaction
                redisOperations.multi();

                // Insert the data
                redisOperations.opsForZSet().add(followeeKey, entityId, System.currentTimeMillis());
                redisOperations.opsForZSet().add(followerKey, userId, System.currentTimeMillis());

                // Commit the transaction
                return redisOperations.exec();
            }
        });
    }

    /**
     * Unfollow
     * @param userId
     * @param entityType
     * @param entityId
     */
    public void unfollow(int userId, int entityType, int entityId) {
        redisTemplate.execute(new SessionCallback() {
            @Override
            public Object execute(RedisOperations redisOperations) throws DataAccessException {
                // Build the Redis keys
                String followeeKey = RedisKeyUtil.getFolloweeKey(userId, entityType);
                String followerKey = RedisKeyUtil.getFollowerKey(entityType, entityId);

                // Start the transaction
                redisOperations.multi();

                // Remove the data
                redisOperations.opsForZSet().remove(followeeKey, entityId);
                redisOperations.opsForZSet().remove(followerKey, userId);

                // Commit the transaction
                return redisOperations.exec();
            }
        });
    }

    /**
     * Count the entities a user follows
     * @param userId user id
     * @param entityType entity type
     * @return
     */
    public long findFolloweeCount(int userId, int entityType) {
        String followeeKey = RedisKeyUtil.getFolloweeKey(userId, entityType);
        return redisTemplate.opsForZSet().zCard(followeeKey);
    }

    /**
     * Count the followers of an entity
     * @param entityType
     * @param entityId
     * @return
     */
    public long findFollowerCount(int entityType, int entityId) {
        String followerKey = RedisKeyUtil.getFollowerKey(entityType, entityId);
        return redisTemplate.opsForZSet().zCard(followerKey);
    }

    /**
     * Check whether the current user already follows the entity
     * @param userId
     * @param entityType
     * @param entityId
     * @return
     */
    public boolean hasFollowed(int userId, int entityType, int entityId) {
        String followeeKey = RedisKeyUtil.getFolloweeKey(userId, entityType);
        return redisTemplate.opsForZSet().score(followeeKey, entityId) != null;
    }

}
```

## Controller

```java
/**
 * Follow
 */
@Controller
public class FollowController {

    @Autowired
    private FollowService followService;

    @Autowired
    private HostHolder hostHolder;

    /**
     * Follow
     * @param entityType
     * @param entityId
     * @return
     */
    @PostMapping("/follow")
    @ResponseBody
    public String follow(int entityType, int entityId) {
        User user = hostHolder.getUser();

        followService.follow(user.getId(), entityType, entityId);

        return CommunityUtil.getJSONString(0, "已关注");
    }

    /**
     * Unfollow
     * @param entityType
     * @param entityId
     * @return
     */
    @PostMapping("/unfollow")
    @ResponseBody
    public String unfollow(int entityType, int entityId) {
        User user = hostHolder.getUser();

        followService.unfollow(user.getId(), entityType, entityId);

        return CommunityUtil.getJSONString(0, "已取消关注");
    }

}
```

In `UserController`, when entering a profile page, add the logic that queries the followee/follower counts:

```java
// Followee count
long followeeCount = followService.findFolloweeCount(userId, ENTITY_TYPE_USER);
model.addAttribute("followeeCount", followeeCount);
// Follower count
long followerCount = followService.findFollowerCount(ENTITY_TYPE_USER, userId);
model.addAttribute("followerCount", followerCount);
// Whether the currently logged-in user already follows this user
boolean hasFollowed = false;
if (hostHolder.getUser() != null) {
    hasFollowed = followService.hasFollowed(hostHolder.getUser().getId(), ENTITY_TYPE_USER, userId);
}
model.addAttribute("hasFollowed", hasFollowed);
```

## Frontend

```html
<div class="media-body">
    <h5 class="mt-0 text-warning">
        <span th:utext="${user.username}"></span>
        <input type="hidden" id="entityId" th:value="${user.id}">
        <button type="button" th:class="|btn ${hasFollowed ? 'btn-secondary' : 'btn-info'} btn-sm float-right mr-5 follow-btn|"
                th:text="${hasFollowed ? '已关注' : '关注TA'}"
                th:if="${loginUser!=null && loginUser.id!=user.id}"></button>
    </h5>

    <div class="text-muted mt-3 mb-5">
        <span>关注了 <a class="text-primary" href="followee.html" th:text="${followeeCount}"></a> 人</span>
        <span class="ml-4">关注者 <a class="text-primary" href="follower.html" th:text="${followerCount}"></a> 人</span>
    </div>
</div>
```

The corresponding `profile.js`:

```js
$(function(){
    $(".follow-btn").click(follow);
});

function follow() {
    var btn = this;
    if($(btn).hasClass("btn-info")) {
        // follow
        $.post(
            CONTEXT_PATH + "/follow",
            {"entityType":3, "entityId":$(btn).prev().val()},
            function (data) {
                data = $.parseJSON(data);
                if (data.code == 0) {
                    // keeping it simple: just reload the page
                    window.location.reload();
                }
                else {
                    alert(data.msg);
                }
            }
        );
    } else {
        // unfollow
        $.post(
            CONTEXT_PATH + "/unfollow",
            {"entityType":3, "entityId":$(btn).prev().val()},
            function (data) {
                data = $.parseJSON(data);
                if (data.code == 0) {
                    // keeping it simple: just reload the page
                    window.location.reload();
                }
                else {
                    alert(data.msg);
                }
            }
        );
    }
}
```

Note: in

```
$.post(
    CONTEXT_PATH + "/follow",
    {"entityType":3, "entityId":$(btn).prev().val()},
```

the names `"entityType"` and `"entityId"` must match the parameter names of the backend `/follow` method; their values are taken from the frontend.
# Followee List, Follower List

---

Service layer:

- Query the users someone follows, with paging
- Query someone's followers, with paging

Presentation layer:

- Handle the "query followees" and "query followers" requests
- Write the "followees" and "followers" templates

## Service
```java
/**
 * Page through the users someone follows
 * (to keep things simple, only following users is handled here, not other entities)
 * @param userId
 * @param offset
 * @param limit
 * @return
 */
public List<Map<String, Object>> findFollowees(int userId, int offset, int limit) {
    String followeeKey = RedisKeyUtil.getFolloweeKey(userId, ENTITY_TYPE_USER);
    Set<Integer> targetIds = redisTemplate.opsForZSet().reverseRange(followeeKey, offset, offset + limit - 1);
    if (targetIds == null) {
        return null;
    }
    List<Map<String, Object>> list = new ArrayList<>();
    for (Integer targetId : targetIds) {
        Map<String, Object> map = new HashMap<>();

        User user = userService.findUserById(targetId);
        map.put("user", user);
        Double score = redisTemplate.opsForZSet().score(followeeKey, targetId);
        map.put("followTime", new Date(score.longValue()));

        list.add(map);
    }

    return list;
}

/**
 * Page through someone's followers
 * (to keep things simple, only followers of users are handled here, not of other entities)
 * @param userId
 * @param offset
 * @param limit
 * @return
 */
public List<Map<String, Object>> findFollowers(int userId, int offset, int limit) {
    String followerKey = RedisKeyUtil.getFollowerKey(ENTITY_TYPE_USER, userId);
    Set<Integer> targetIds = redisTemplate.opsForZSet().reverseRange(followerKey, offset, offset + limit - 1);
    if (targetIds == null) {
        return null;
    }
    List<Map<String, Object>> list = new ArrayList<>();
    for (Integer targetId : targetIds) {
        Map<String, Object> map = new HashMap<>();

        User user = userService.findUserById(targetId);
        map.put("user", user);
        Double score = redisTemplate.opsForZSet().score(followerKey, targetId);
        map.put("followTime", new Date(score.longValue()));

        list.add(map);
    }

    return list;
}
```
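The `reverseRange` paging above can be mimicked in memory to see the ordering: a map from entityId to score stands in for the zset, and sorting by score descending plus skip/limit reproduces `reverseRange(key, offset, offset + limit - 1)` (a sketch of the semantics only, not Spring Data Redis itself):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FolloweePageDemo {
    // entityId -> score (follow time in millis); stands in for the Redis zset
    private final Map<Integer, Long> followees = new HashMap<>();

    public void follow(int entityId, long now) { followees.put(entityId, now); }

    // Mirrors reverseRange(key, offset, offset + limit - 1): newest follows first
    public List<Integer> page(int offset, int limit) {
        return followees.entrySet().stream()
                .sorted(Map.Entry.<Integer, Long>comparingByValue().reversed())
                .skip(offset).limit(limit)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        FolloweePageDemo demo = new FolloweePageDemo();
        demo.follow(101, 1000L);
        demo.follow(102, 2000L);
        demo.follow(103, 3000L);
        System.out.println(demo.page(0, 2)); // [103, 102]
    }
}
```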

## Controller

```java
/**
 * The list of users someone follows
 * @param userId
 * @param page
 * @param model
 * @return
 */
@GetMapping("/followees/{userId}")
public String getFollowees(@PathVariable("userId") int userId, Page page, Model model) {
    User user = userService.findUserById(userId);
    if (user == null) {
        throw new RuntimeException("该用户不存在");
    }
    model.addAttribute("user", user);

    page.setLimit(5);
    page.setPath("/followees/" + userId);
    page.setRows((int) followService.findFolloweeCount(userId, ENTITY_TYPE_USER));

    // Fetch the followee list
    List<Map<String, Object>> userList = followService.findFollowees(userId, page.getOffset(), page.getLimit());

    if (userList != null) {
        for (Map<String, Object> map : userList) {
            User u = (User) map.get("user"); // the followed user
            map.put("hasFollowed", hasFollowed(u.getId())); // whether the logged-in user follows this user in the list
        }
    }

    model.addAttribute("users", userList);

    return "/site/followee";
}

/**
 * Someone's follower list
 * @param userId
 * @param page
 * @param model
 * @return
 */
@GetMapping("/followers/{userId}")
public String getFollowers(@PathVariable("userId") int userId, Page page, Model model) {
    User user = userService.findUserById(userId);
    if (user == null) {
        throw new RuntimeException("该用户不存在");
    }
    model.addAttribute("user", user);

    page.setLimit(5);
    page.setPath("/followers/" + userId);
    page.setRows((int) followService.findFollowerCount(ENTITY_TYPE_USER, userId));

    // Fetch the follower list
    List<Map<String, Object>> userList = followService.findFollowers(userId, page.getOffset(), page.getLimit());

    if (userList != null) {
        for (Map<String, Object> map : userList) {
            User u = (User) map.get("user"); // the follower
            map.put("hasFollowed", hasFollowed(u.getId())); // whether the logged-in user follows this user in the list
        }
    }

    model.addAttribute("users", userList);

    return "/site/follower";
}

/**
 * Whether the currently logged-in user follows the given user
 * @param userId the given user
 * @return
 */
private boolean hasFollowed(int userId) {
    if (hostHolder.getUser() == null) {
        return false;
    }

    return followService.hasFollowed(hostHolder.getUser().getId(), ENTITY_TYPE_USER, userId);
}
```

## Frontend

```html
<a th:href="@{|/followees/${user.id}|}">
    关注了 <span th:text="${followeeCount}"></span> 人
</a>
<a th:href="@{|/followers/${user.id}|}">
    关注者 <span th:text="${followerCount}"></span> 人
</a>
```

The corresponding followee page `followee.html` and follower page `follower.html` are similar; taking `followee.html` as an example:

```html
<li th:each="map:${users}">
    <a th:href="@{|/user/profile/${map.user.id}|}">
        <img th:src="${map.user.headerUrl}" alt="用户头像" >
    </a>

    <span th:text="${map.user.username}"></span>
    <span>
        关注于 <i th:text="${#dates.format(map.followTime, 'yyyy-MM-dd HH:mm:ss')}"></i></span>

    <input type="hidden" id="entityId" th:value="${map.user.id}">
    <button type="button" th:class="|btn ${map.hasFollowed ? 'btn-secondary' : 'btn-info'} btn-sm float-right mr-5 follow-btn|"
            th:text="${map.hasFollowed ? '已关注' : '关注TA'}"
            th:if="${loginUser!=null && loginUser.id!=map.user.id}"></button>
</li>
```
# Optimizing the Login Module

---

## Storing Captchas in Redis for a Short Time

- Captchas are refreshed frequently, so performance matters
- Captchas do not need to be stored permanently (kept in the cookie and in Redis for 60 s)
- With distributed deployment, sessions raise a sharing problem

> Previously we stored the captcha in the session; here we store it in Redis instead.
>
> The data stored in Redis: key (kaptcha:random string), value (the captcha text).
>
> When the user types the captcha they are not logged in yet, so we cannot use a user id to say who the captcha belongs to. Instead we generate a random string, store it briefly in a cookie, and use that string to identify the user.

Utility class: generating the Redis key
```java
/**
 * Generates Redis keys
 */
public class RedisKeyUtil {

    private static final String SPLIT = ":";

    private static final String PREFIX_KAPTCHA = "kaptcha"; // captcha

    /**
     * Login captcha (identifies which user the captcha belongs to)
     * @param owner when the user opens the login page they are not logged in yet,
     *              so we cannot identify them by id; we generate a random string,
     *              store it briefly in a cookie, and use it to identify the user
     * @return
     */
    public static String getKaptchaKey(String owner) {
        return PREFIX_KAPTCHA + SPLIT + owner;
    }

}
```
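To make the owner mechanism concrete: a random string becomes part of the key, so each anonymous visitor gets their own captcha entry. In this sketch a plain `java.util.UUID` stands in for `CommunityUtil.generateUUID()`, and the key-building logic is copied from `RedisKeyUtil` above (hypothetical demo class):

```java
import java.util.UUID;

public class KaptchaKeyDemo {
    private static final String SPLIT = ":";
    private static final String PREFIX_KAPTCHA = "kaptcha";

    // Same logic as RedisKeyUtil.getKaptchaKey above
    public static String getKaptchaKey(String owner) {
        return PREFIX_KAPTCHA + SPLIT + owner;
    }

    public static void main(String[] args) {
        // The random string identifies the not-yet-logged-in user (kept in a 60 s cookie)
        String kaptchaOwner = UUID.randomUUID().toString().replaceAll("-", "");
        System.out.println(getKaptchaKey(kaptchaOwner)); // e.g. kaptcha:3f2a...
    }
}
```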

Modify the method that generates the captcha:

```java
/**
 * Generate the captcha
 * @param response
 * @param session
 */
@GetMapping("/kaptcha")
public void getKaptcha(HttpServletResponse response, HttpSession session) {
    // Generate the captcha
    String text = kaptchaProducer.createText(); // random characters
    System.out.println("验证码:" + text);
    BufferedImage image = kaptchaProducer.createImage(text); // the image

    // Previously: store the captcha in the session
    // session.setAttribute("kaptcha", text);

    // Owner of the captcha
    String kaptchaOwner = CommunityUtil.generateUUID();
    Cookie cookie = new Cookie("kaptchaOwner", kaptchaOwner);
    cookie.setMaxAge(60);
    cookie.setPath(contextPath);
    response.addCookie(cookie);
    // Store the captcha in Redis
    String redisKey = RedisKeyUtil.getKaptchaKey(kaptchaOwner);
    redisTemplate.opsForValue().set(redisKey, text, 60, TimeUnit.SECONDS);

    // Write the image to the browser
    response.setContentType("image/png");
    try {
        ServletOutputStream os = response.getOutputStream();
        ImageIO.write(image, "png", os);
    } catch (IOException e) {
        logger.error("响应验证码失败: " + e.getMessage());
    }
}
```

## Storing Login Tickets in Redis

- Every request must look up the user's login ticket, so it is accessed very frequently (this data is not deleted; it is kept in Redis permanently)

Utility class: generating the Redis key

```java
/**
 * Generates Redis keys
 */
public class RedisKeyUtil {

    private static final String SPLIT = ":";

    private static final String PREFIX_TICKET = "ticket"; // login ticket

    /**
     * Login ticket
     * @param ticket
     * @return
     */
    public static String getTicketKey(String ticket) {
        return PREFIX_TICKET + SPLIT + ticket;
    }

}
```

**The LoginTicket table and the operations on it can now be retired.** Rather than deleting them outright, it is better to mark the `LoginMapper` class with the `@Deprecated` annotation to discourage further use.

Modify the three UserService methods that create/update/query the login ticket:

```java
// Create the login ticket
String redisKey = RedisKeyUtil.getTicketKey(loginTicket.getTicket());
redisTemplate.opsForValue().set(redisKey, loginTicket);

/**
 * Log the user out (mark the ticket as invalid)
 * @param ticket
 */
public void logout(String ticket) {
    // loginTicketMapper.updateStatus(ticket, 1);
    // Update the user's ticket in Redis (read it, modify it, write it back)
    String redisKey = RedisKeyUtil.getTicketKey(ticket);
    LoginTicket loginTicket = (LoginTicket) redisTemplate.opsForValue().get(redisKey);
    loginTicket.setStatus(1);
    redisTemplate.opsForValue().set(redisKey, loginTicket);
}

/**
 * Look up the LoginTicket by ticket
 * @param ticket
 * @return
 */
public LoginTicket findLoginTicket(String ticket) {
    // return loginTicketMapper.selectByTicket(ticket);
    String redisKey = RedisKeyUtil.getTicketKey(ticket);
    return (LoginTicket) redisTemplate.opsForValue().get(redisKey);
}
```

## Caching User Data in Redis

- Every request must look up the user by ticket, so it is accessed very frequently (user data is kept in Redis only for a limited time)

Add the following three methods to UserService:

- Read from the cache first;
- On a cache miss, load the user and populate the cache;
- When the user data changes, evict the corresponding cache entry;

```java
/**
 * Read from the cache first
 * @param userId
 * @return
 */
private User getCache(int userId) {
    String redisKey = RedisKeyUtil.getUserKey(userId);
    return (User) redisTemplate.opsForValue().get(redisKey);
}

/**
 * On a cache miss, load the user from the database and populate the cache
 * @param userId
 * @return
 */
private User initCache(int userId) {
    User user = userMapper.selectById(userId);
    String redisKey = RedisKeyUtil.getUserKey(userId);
    redisTemplate.opsForValue().set(redisKey, user, 3600, TimeUnit.SECONDS);
    return user;
}

/**
 * When the user data changes, evict the corresponding cache entry
 * @param userId
 */
private void clearCache(int userId) {
    String redisKey = RedisKeyUtil.getUserKey(userId);
    redisTemplate.delete(redisKey);
}
```

Then modify the other UserService methods to use these three methods:

```java
/**
 * Find a user by id
 * @param id
 * @return
 */
public User findUserById(int id) {
    // return userMapper.selectById(id);
    User user = getCache(id); // read from the cache first
    if (user == null) {
        user = initCache(id);
    }
    return user;
}

/**
 * Activate a user
 * @param userId user id
 * @param code activation code
 * @return
 */
public int activation(int userId, String code) {
    User user = userMapper.selectById(userId);
    if (user.getStatus() == 1) {
        // already activated
        return ACTIVATION_REPEAT;
    }
    else if (user.getActivationCode().equals(code)) {
        // mark the user as activated
        userMapper.updateStatus(userId, 1);
        clearCache(userId); // the user data changed, evict the stale cache entry
        return ACTIVATION_SUCCESS;
    }
    else {
        return ACTIVATION_FAILURE;
    }
}
```
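The interplay of the three helper methods is the classic cache-aside pattern, which can be sketched with two in-memory maps standing in for MySQL and Redis (a hypothetical demo, not the project's classes):

```java
import java.util.HashMap;
import java.util.Map;

public class UserCacheDemo {
    private final Map<Integer, String> db = new HashMap<>();    // stands in for MySQL
    private final Map<Integer, String> cache = new HashMap<>(); // stands in for Redis
    int dbHits = 0;

    public UserCacheDemo() { db.put(1, "alice"); }

    // Mirrors findUserById: read the cache first, fall back to the DB and populate
    public String findUserById(int id) {
        String user = cache.get(id);      // getCache
        if (user == null) {
            user = db.get(id); dbHits++;  // initCache: load from the DB...
            cache.put(id, user);          // ...and populate the cache
        }
        return user;
    }

    // Mirrors clearCache: evict on update instead of writing through the cache
    public void updateUser(int id, String name) {
        db.put(id, name);
        cache.remove(id);
    }

    public static void main(String[] args) {
        UserCacheDemo demo = new UserCacheDemo();
        demo.findUserById(1);
        demo.findUserById(1);                     // served from the cache
        System.out.println(demo.dbHits);          // 1
        demo.updateUser(1, "bob");                // eviction forces a fresh read
        System.out.println(demo.findUserById(1)); // bob
    }
}
```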

# Sending System Notifications

---

Likes, follows, private messages, and similar features all send notifications. On a high-traffic social site the volume of system notifications is enormous, so to protect system performance we build a TB-scale asynchronous messaging system with the Kafka message queue.

> Prerequisite: the blocking queues in the native Java API.

Download and install Kafka (Kafka ships with Zookeeper; adjust the corresponding configuration files).

<img src="https://gitee.com/veal98/images/raw/master/img/20210127211208.png" style="zoom: 50%;" />

<img src="https://gitee.com/veal98/images/raw/master/img/20210127211350.png" style="zoom:50%;" />

1) First, start Kafka.

Step 1:

```shell
cd d:\kafka_2.13-2.7.0

bin\windows\zookeeper-server-start.bat config\zookeeper.properties
```

Step 2: open another terminal:

```shell
cd d:\kafka_2.13-2.7.0

bin\windows\kafka-server-start.bat config\server.properties
```
2) Then open another terminal and create a topic.

3) The producer produces messages.

4) Open another terminal; the consumer reads the messages.

## Integrating Kafka with Spring

### Import the dependency

### Configure Kafka

- Configure the server and consumer

## Accessing Kafka

- Producer

```java
KafkaTemplate.send(topic, data)
```

- Consumer

```java
@KafkaListener(topics = {"test"})
public void handleMessage(ConsumerRecord record){ }
```

## Requirements for Sending System Notifications

System notifications reuse the private-message table `message`, with `from_id` fixed to 1 to indicate the message was sent by the system; remember to store this system user in the user table.

### Encapsulating the Event Object
```java
/**
 * Encapsulates an event (used for system notifications)
 */
public class Event {

    private String topic; // event type
    private int userId; // who triggered the event
    private int entityType; // entity type
    private int entityId; // entity id
    private int entityUserId; // author of the entity
    private Map<String, Object> data = new HashMap<>(); // extra data we may need later

    public String getTopic() {
        return topic;
    }

    public Event setTopic(String topic) {
        this.topic = topic;
        return this;
    }

    public int getUserId() {
        return userId;
    }

    public Event setUserId(int userId) {
        this.userId = userId;
        return this;
    }

    public int getEntityType() {
        return entityType;
    }

    public Event setEntityType(int entityType) {
        this.entityType = entityType;
        return this;
    }

    public int getEntityId() {
        return entityId;
    }

    public Event setEntityId(int entityId) {
        this.entityId = entityId;
        return this;
    }

    public int getEntityUserId() {
        return entityUserId;
    }

    public Event setEntityUserId(int entityUserId) {
        this.entityUserId = entityUserId;
        return this;
    }

    public Map<String, Object> getData() {
        return data;
    }

    public Event setData(String key, Object value) {
        this.data.put(key, value);
        return this;
    }
}
```

Tip: giving each setter the entity class itself as its return type enables chained (fluent) calls.

To clarify `userId` versus `entityUserId`: if Zhang San likes one of Li Si's posts, then `userId` is Zhang San's id, and since the system notification goes to Li Si, `entityUserId` is Li Si's id.
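The chained-setter tip can be demonstrated with a trimmed copy of the Event class (a hypothetical `EventDemo` with just three of the fields above, enough to show the fluent chain):

```java
public class EventDemo {
    private String topic;
    private int userId;
    private int entityUserId;

    // Each setter returns the object itself, so calls can be chained
    public EventDemo setTopic(String topic) { this.topic = topic; return this; }
    public EventDemo setUserId(int userId) { this.userId = userId; return this; }
    public EventDemo setEntityUserId(int id) { this.entityUserId = id; return this; }

    public String getTopic() { return topic; }
    public int getUserId() { return userId; }
    public int getEntityUserId() { return entityUserId; }

    public static void main(String[] args) {
        // Zhang San (id 11) likes a post by Li Si (id 22): one fluent chain
        EventDemo event = new EventDemo()
                .setTopic("like")
                .setUserId(11)
                .setEntityUserId(22);
        System.out.println(event.getTopic() + " -> user " + event.getEntityUserId()); // like -> user 22
    }
}
```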

### The Event Producer

```java
/**
 * Event producer
 */
@Component
public class EventProducer {

    @Autowired
    private KafkaTemplate kafkaTemplate;

    /**
     * Fire an event
     * @param event
     */
    public void fireEvent(Event event) {
        // Publish the event to its topic
        kafkaTemplate.send(event.getTopic(), JSONObject.toJSONString(event));
    }

}
```
### The Event Consumer

```java
/**
 * Event consumer
 */
@Component
public class EventConsumer implements CommunityConstant {

    private static final Logger logger = LoggerFactory.getLogger(EventConsumer.class);

    @Autowired
    private MessageService messageService;

    @KafkaListener(topics = {TOPIC_COMMNET, TOPIC_LIKE, TOPIC_FOLLOW})
    public void handleMessage(ConsumerRecord record) {
        if (record == null || record.value() == null) {
            logger.error("message content is empty");
            return;
        }
        Event event = JSONObject.parseObject(record.value().toString(), Event.class);
        if (event == null) {
            logger.error("malformed message");
            return;
        }

        // Send the system notification
        Message message = new Message();
        message.setFromId(SYSTEM_USER_ID);
        message.setToId(event.getEntityUserId());
        message.setConversationId(event.getTopic());
        message.setCreateTime(new Date());

        Map<String, Object> content = new HashMap<>();
        content.put("userId", event.getUserId());
        content.put("entityType", event.getEntityType());
        content.put("entityId", event.getEntityId());
        if (!event.getData().isEmpty()) {
            for (Map.Entry<String, Object> entry : event.getData().entrySet()) {
                content.put(entry.getKey(), entry.getValue());
            }
        }
        message.setContent(JSONObject.toJSONString(content));

        messageService.addMessage(message);
    }

}
```
The content stored in the `content` field is JSON, which makes it easy to read back later.

### Updating the Presentation Layer

- Clicking a comment-topic notification (someone commented on your post/comment/reply) takes you to the detail page of the post the comment belongs to
- Clicking a like-topic notification (someone liked your post/comment/reply) takes you to the detail page of the post the like belongs to
- Clicking a follow-topic notification (someone followed you) takes you to that user's profile page

Add the event-firing (system notification) logic in the presentation layer, taking `CommentController` as an example:
```java
/**
 * Add a comment
 * @param discussPostId
 * @param comment
 * @return
 */
@PostMapping("/add/{discussPostId}")
public String addComment(@PathVariable("discussPostId") int discussPostId, Comment comment) {
    comment.setUserId(hostHolder.getUser().getId());
    comment.setStatus(0);
    comment.setCreateTime(new Date());
    commentService.addComment(comment);

    // Fire the comment event (system notification)
    Event event = new Event()
            .setTopic(TOPIC_COMMNET)
            .setUserId(hostHolder.getUser().getId())
            .setEntityType(comment.getEntityType())
            .setEntityId(comment.getEntityId())
            .setData("postId", discussPostId);
    if (comment.getEntityType() == ENTITY_TYPE_POST) {
        DiscussPost target = discussPostSerivce.findDiscussPostById(comment.getEntityId());
        event.setEntityUserId(target.getUserId());
    } else if (comment.getEntityType() == ENTITY_TYPE_COMMENT) {
        Comment target = commentService.findCommentById(comment.getEntityId());
        event.setEntityUserId(target.getUserId());
    }

    eventProducer.fireEvent(event);

    return "redirect:/discuss/detail/" + discussPostId;
}
```
After clicking a comment notification, the user lands on the post the comment belongs to.

Note that we made a small refactor to `LikeController`:
```java
/**
 * Like
 * @param entityType
 * @param entityId
 * @param entityUserId id of the author of the liked post/comment
 * @param postId id of the post the like belongs to (the liked post itself, or the post the liked comment/reply belongs to)
 * @return
 */
@PostMapping("/like")
@ResponseBody
public String like(int entityType, int entityId, int entityUserId, int postId) {
    User user = hostHolder.getUser();
    // Like
    likeService.like(user.getId(), entityType, entityId, entityUserId);
    // Like count
    long likeCount = likeService.findEntityLikeCount(entityType, entityId);
    // Like status
    int likeStatus = likeService.findEntityLikeStatus(user.getId(), entityType, entityId);

    Map<String, Object> map = new HashMap<>();
    map.put("likeCount", likeCount);
    map.put("likeStatus", likeStatus);

    // Fire the like event (system notification) - unliking sends no notification
    if (likeStatus == 1) {
        Event event = new Event()
                .setTopic(TOPIC_LIKE)
                .setUserId(hostHolder.getUser().getId())
                .setEntityType(entityType)
                .setEntityId(entityId)
                .setEntityUserId(entityUserId)
                .setData("postId", postId);
        eventProducer.fireEvent(event);
    }

    return CommunityUtil.getJSONString(0, null, map);
}
```
The post id was added to the parameter list because, whether the like targets a post itself or a comment/reply under a post, clicking the notification must lead to the corresponding post. So we pass the post id explicitly; it is not redundant with `entityId`.

## A Caveat

Update the logic in `ServiceLogAspect` (the unified logging aspect) to add a null check on `ServletRequestAttributes`:



This aspect intercepts every Service method. Before this section, every access to a Service went through a Controller. Now, however, the consumer calls `MessageService` directly rather than through a Controller, so there is no request in that call chain, and `ServletRequestAttributes` is null.
# Displaying System Notifications

---


## DAO

```xml
int selectNoticeUnReadCount(int userId, String topic);

<!-- Count unread system notifications -->
<select id="selectNoticeUnReadCount" resultType="int">
    select count(id)
    from message
    where status = 0
    and from_id = 1
    and to_id = #{userId}
    <if test="topic != null">
        and conversation_id = #{topic}
    </if>
</select>
```
This is a dynamic query: if `selectNoticeUnReadCount` is called without a topic, it counts the unread system notifications across all topics.

## Service
```java
/**
 * Find the latest system notification under a topic
 * @param userId
 * @param topic
 * @return
 */
public Message findLatestNotice(int userId, String topic) {
    return messageMapper.selectLatestNotice(userId, topic);
}

/**
 * Count the system notifications under a topic
 * @param userId
 * @param topic
 * @return
 */
public int findNoticeCount(int userId, String topic) {
    return messageMapper.selectNoticeCount(userId, topic);
}

/**
 * Count the unread system notifications
 * @param userId
 * @param topic
 * @return
 */
public int findNoticeUnReadCount(int userId, String topic) {
    return messageMapper.selectNoticeUnReadCount(userId, topic);
}

/**
 * Find the notifications under a topic, paged
 * @param userId
 * @param topic
 * @param offset
 * @param limit
 * @return
 */
public List<Message> findNotices(int userId, String topic, int offset, int limit) {
    return messageMapper.selectNotices(userId, topic, offset, limit);
}
```
## Controller
```java
/**
 * Notification list (shows only the latest message per topic)
 * @param model
 * @return
 */
@GetMapping("/notice/list")
public String getNoticeList(Model model) {
    User user = hostHolder.getUser();

    // Comment notifications
    Message message = messageService.findLatestNotice(user.getId(), TOPIC_COMMNET);
    // Assemble the data the notification needs
    if (message != null) {
        Map<String, Object> messageVO = new HashMap<>();

        messageVO.put("message", message);

        String content = HtmlUtils.htmlUnescape(message.getContent());
        Map<String, Object> data = JSONObject.parseObject(content, HashMap.class);

        messageVO.put("user", userService.findUserById((Integer) data.get("userId")));
        messageVO.put("entityType", data.get("entityType"));
        messageVO.put("entityId", data.get("entityId"));
        messageVO.put("postId", data.get("postId"));

        int count = messageService.findNoticeCount(user.getId(), TOPIC_COMMNET);
        messageVO.put("count", count);

        int unread = messageService.findNoticeUnReadCount(user.getId(), TOPIC_COMMNET);
        messageVO.put("unread", unread);

        model.addAttribute("commentNotice", messageVO);
    }

    // Like notifications
    ...........
    // Follow notifications
    ...........

    // Unread message counts
    int letterUnreadCount = messageService.findLetterUnreadCount(user.getId(), null);
    model.addAttribute("letterUnreadCount", letterUnreadCount);
    int noticeUnreadCount = messageService.findNoticeUnReadCount(user.getId(), null);
    model.addAttribute("noticeUnreadCount", noticeUnreadCount);

    return "/site/notice";
}

/**
 * Notification list under a given topic
 * @param topic
 * @param page
 * @param model
 * @return
 */
@GetMapping("/notice/detail/{topic}")
public String getNoticeDetail(@PathVariable("topic") String topic, Page page, Model model) {
    User user = hostHolder.getUser();

    page.setLimit(5);
    page.setPath("/notice/detail/" + topic);
    page.setRows(messageService.findNoticeCount(user.getId(), topic));

    List<Message> noticeList = messageService.findNotices(user.getId(), topic, page.getOffset(), page.getLimit());
    List<Map<String, Object>> noticeVoList = new ArrayList<>();
    if (noticeList != null) {
        for (Message notice : noticeList) {
            Map<String, Object> map = new HashMap<>();
            // The notification itself
            map.put("notice", notice);
            // Its content
            String content = HtmlUtils.htmlUnescape(notice.getContent());
            Map<String, Object> data = JSONObject.parseObject(content, HashMap.class);
            map.put("user", userService.findUserById((Integer) data.get("userId")));
            map.put("entityType", data.get("entityType"));
            map.put("entityId", data.get("entityId"));
            map.put("postId", data.get("postId"));
            // The sender of the system notification
            map.put("fromUser", userService.findUserById(notice.getFromId()));

            noticeVoList.add(map);
        }
    }
    model.addAttribute("notices", noticeVoList);

    // Mark as read
    List<Integer> ids = getUnreadLetterIds(noticeList);
    if (!ids.isEmpty()) {
        messageService.readMessage(ids);
    }

    return "/site/notice-detail";
}
```
The `content` field of a system notification stored in the `message` table is JSON:



We need to take this JSON string, parse it into an object, and render it as a notification.

The front-end changes are omitted here.

## Interceptor

One thing to watch is keeping the unread-message count in the navigation bar (unread private messages + unread system notifications) up to date. We implement this with an interceptor that runs after the Controller and before the template:
```java
@Component
public class MessageInterceptor implements HandlerInterceptor {

    @Autowired
    private HostHolder hostHolder;

    @Autowired
    private MessageService messageService;

    /**
     * Called after the Controller and before the template renders.
     * Fetches the counts of unread private messages / system notifications.
     * @param request
     * @param response
     * @param handler
     * @param modelAndView
     * @throws Exception
     */
    @Override
    public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler, ModelAndView modelAndView) throws Exception {
        User user = hostHolder.getUser();
        if (user != null && modelAndView != null) {
            int letterUnreadCount = messageService.findLetterUnreadCount(user.getId(), null);
            int noticeUnreadCount = messageService.findNoticeUnReadCount(user.getId(), null);
            modelAndView.addObject("allUnreadCount", letterUnreadCount + noticeUnreadCount);
        }
    }
}
```
Don't forget to register this interceptor in `WebMvcConfig`.
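For reference, the registration looks roughly like this; it is a sketch assuming a `WebMvcConfigurer`-based config class, and the exclude patterns are illustrative (adjust them to the project's actual static-resource layout):

```java
@Configuration
public class WebMvcConfig implements WebMvcConfigurer {

    @Autowired
    private MessageInterceptor messageInterceptor;

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        // Run for every page request, but skip static resources.
        registry.addInterceptor(messageInterceptor)
                .excludePathPatterns("/**/*.css", "/**/*.js", "/**/*.png", "/**/*.jpg", "/**/*.jpeg");
    }
}
```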
# Search

---

To make data searchable with Elasticsearch, it has to be stored in Elasticsearch as an extra copy.



An index corresponds to a database (since 7.0 an index corresponds to a table), a type to a table (removed in 7.0), a document to a row in a table, and a field to a column.
## Downloading and Installing Elasticsearch

Note: the Elasticsearch version you download must match the one pinned by your Spring Boot version; this project uses Spring Boot 2.1.5.



[Elasticsearch 6.4.3 download](https://www.elastic.co/cn/downloads/past-releases/elasticsearch-6-4-3)

After unpacking, edit config/elasticsearch.yml:

<img src="https://gitee.com/veal98/images/raw/master/img/20210129111542.png" style="zoom:50%;" />

Add it to the environment variables:

<img src="https://gitee.com/veal98/images/raw/master/img/20210129112143.png" style="zoom:50%;" />

You also need to install a **Chinese word-segmentation plugin** (Elasticsearch only ships with an English analyzer): [elasticsearch-analysis-ik 6.4.3 download](https://github.com/medcl/elasticsearch-analysis-ik/releases/tag/v6.4.3)

Note: it must be unpacked into the plugins/ik folder under your Elasticsearch install directory (D:\elasticsearch-6.4.3\plugins\ik).

Start Elasticsearch:

<img src="https://gitee.com/veal98/images/raw/master/img/20210129113947.png" style="zoom:50%;" />
Common commands:

```shell
# create an index
curl -X PUT "localhost:9200/test"

# list indices
curl -X GET "localhost:9200/_cat/indices?v"

# delete an index
curl -X DELETE "localhost:9200/test"
```


You can use Postman instead of the raw command line. For example, creating an index:



Adding a document:



Querying a document:



Deleting a document:



Search: find documents whose title contains "互联网":



Compound search: find documents whose title or content contains "互联网":


## Integrating Elasticsearch with Spring Boot

### Add the Dependency

```xml
<!--Elasticsearch-->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
```
### Configure Elasticsearch

- cluster-name
- cluster-nodes

```properties
# Elasticsearch
# These values come from elasticsearch.yml in the Elasticsearch install directory and can be changed there
spring.data.elasticsearch.cluster-name = community
spring.data.elasticsearch.cluster-nodes = 127.0.0.1:9300
```
### Resolving the Netty Startup Conflict Between Elasticsearch and Redis

```java
@SpringBootApplication
public class CommunityApplication {

    /**
     * Resolve the Netty startup conflict between Elasticsearch and Redis
     */
    @PostConstruct
    public void init() {
        System.setProperty("es.set.netty.runtime.available.processors", "false");
    }

    public static void main(String[] args) {
        SpringApplication.run(CommunityApplication.class, args);
    }

}
```
### Spring Data Elasticsearch

- ElasticsearchTemplate
- ElasticsearchRepository (recommended)

First, connect the entity class to Elasticsearch:

```java
@Document(indexName = "discusspost", type = "_doc", shards = 6, replicas = 3)
public class DiscussPost {
```

The type attribute is fully removed in Elasticsearch 7 and discouraged in 6, so we simply fix it to _doc and ignore it.
```java
@Document(indexName = "discusspost", type = "_doc", shards = 6, replicas = 3)
public class DiscussPost {

    @Id
    private int id;

    @Field(type = FieldType.Integer)
    private int userId;

    @Field(type = FieldType.Text, analyzer = "ik_max_word", searchAnalyzer = "ik_smart")
    private String title;

    @Field(type = FieldType.Text, analyzer = "ik_max_word", searchAnalyzer = "ik_smart")
    private String content;

    @Field(type = FieldType.Integer)
    private int type;

    @Field(type = FieldType.Integer)
    private int status;

    @Field(type = FieldType.Date)
    private Date createTime;

    @Field(type = FieldType.Integer)
    private int commentCount;

    @Field(type = FieldType.Double)
    private double score;
```
Testing the CRUD operations:

```java
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = CommunityApplication.class)
@SpringBootTest
public class ElasticsearchTests {

    @Autowired
    private DiscussPostMapper discussPostMapper;

    @Autowired
    private DiscussPostRepository discussPostRepository;

    @Autowired
    private ElasticsearchTemplate elasticsearchTemplate;

    /**
     * Test inserting data
     */
    @Test
    public void testInsert() {
        discussPostRepository.save(discussPostMapper.selectDiscussPostById(241));
        discussPostRepository.save(discussPostMapper.selectDiscussPostById(242));
        discussPostRepository.save(discussPostMapper.selectDiscussPostById(243));
    }

    /**
     * Test inserting data in bulk
     */
    @Test
    public void testInsertList() {
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(101, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(102, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(103, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(111, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(112, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(131, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(132, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(133, 0, 100));
        discussPostRepository.saveAll(discussPostMapper.selectDiscussPosts(134, 0, 100));
    }

    /**
     * Test updating data
     */
    @Test
    public void testUpdate() {
        DiscussPost discussPost = discussPostMapper.selectDiscussPostById(231);
        discussPost.setContent("Great Elasticsearch");
        discussPostRepository.save(discussPost);
    }

    /**
     * Test deleting data (note: the row in the database is not deleted)
     */
    @Test
    public void testDelete() {
        discussPostRepository.deleteById(231);
        // discussPostRepository.deleteAll();
    }

    /**
     * Test searching with ElasticsearchRepository
     */
    @Test
    public void testSearchByRepository() {
        SearchQuery searchQuery = new NativeSearchQueryBuilder()
                .withQuery(QueryBuilders.multiMatchQuery("互联网寒冬", "title", "content"))
                .withSort(SortBuilders.fieldSort("type").order(SortOrder.DESC))
                .withSort(SortBuilders.fieldSort("score").order(SortOrder.DESC))
                .withSort(SortBuilders.fieldSort("createTime").order(SortOrder.DESC))
                .withPageable(PageRequest.of(0, 10))
                .withHighlightFields(
                        new HighlightBuilder.Field("title").preTags("<em>").postTags("</em>"),
                        new HighlightBuilder.Field("content").preTags("<em>").postTags("</em>")
                ).build();

        // elasticsearchTemplate.queryForPage(searchQuery, class, SearchResultMapper);
        // The underlying call fetches the highlighted values but does not apply them,
        // so for full highlighting support use ElasticsearchTemplate instead.

        Page<DiscussPost> page = discussPostRepository.search(searchQuery);

        System.out.println(page.getTotalElements());
        System.out.println(page.getTotalPages());
        System.out.println(page.getNumber());
        System.out.println(page.getSize());
        for (DiscussPost post : page) {
            System.out.println(post);
        }
    }

    /**
     * Test searching with ElasticsearchTemplate
     */
    @Test
    public void testSearchTemplate() {
        SearchQuery searchQuery = new NativeSearchQueryBuilder()
                .withQuery(QueryBuilders.multiMatchQuery("互联网寒冬", "title", "content"))
                .withSort(SortBuilders.fieldSort("type").order(SortOrder.DESC))
                .withSort(SortBuilders.fieldSort("score").order(SortOrder.DESC))
                .withSort(SortBuilders.fieldSort("createTime").order(SortOrder.DESC))
                .withPageable(PageRequest.of(0, 10))
                .withHighlightFields(
                        new HighlightBuilder.Field("title").preTags("<em>").postTags("</em>"),
                        new HighlightBuilder.Field("content").preTags("<em>").postTags("</em>")
                ).build();

        Page<DiscussPost> page = elasticsearchTemplate.queryForPage(searchQuery, DiscussPost.class, new SearchResultMapper() {
            @Override
            public <T> AggregatedPage<T> mapResults(SearchResponse searchResponse, Class<T> aClass, Pageable pageable) {
                SearchHits hits = searchResponse.getHits();
                if (hits.getTotalHits() <= 0) {
                    return null;
                }

                List<DiscussPost> list = new ArrayList<>();

                for (SearchHit hit : hits) {
                    DiscussPost post = new DiscussPost();

                    String id = hit.getSourceAsMap().get("id").toString();
                    post.setId(Integer.valueOf(id));

                    String userId = hit.getSourceAsMap().get("userId").toString();
                    post.setUserId(Integer.valueOf(userId));

                    String title = hit.getSourceAsMap().get("title").toString();
                    post.setTitle(title);

                    String content = hit.getSourceAsMap().get("content").toString();
                    post.setContent(content);

                    String status = hit.getSourceAsMap().get("status").toString();
                    post.setStatus(Integer.valueOf(status));

                    String createTime = hit.getSourceAsMap().get("createTime").toString();
                    post.setCreateTime(new Date(Long.valueOf(createTime)));

                    String commentCount = hit.getSourceAsMap().get("commentCount").toString();
                    post.setCommentCount(Integer.valueOf(commentCount));

                    // Apply the highlighted fragments
                    HighlightField titleField = hit.getHighlightFields().get("title");
                    if (titleField != null) {
                        post.setTitle(titleField.getFragments()[0].toString());
                    }

                    HighlightField contentField = hit.getHighlightFields().get("content");
                    if (contentField != null) {
                        post.setContent(contentField.getFragments()[0].toString());
                    }

                    list.add(post);
                }

                return new AggregatedPageImpl(list, pageable,
                        hits.getTotalHits(), searchResponse.getAggregations(), searchResponse.getScrollId(), hits.getMaxScore());
            }
        });

        System.out.println(page.getTotalElements());
        System.out.println(page.getTotalPages());
        System.out.println(page.getNumber());
        System.out.println(page.getSize());
        for (DiscussPost post : page) {
            System.out.println(post);
        }
    }

}
```
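Conceptually, the `preTags("<em>")` / `postTags("</em>")` highlighting configured above just returns fragments with the matched term wrapped in those tags. A minimal plain-Java sketch of that wrapping (the real Elasticsearch highlighter is analyzer-aware and operates on tokenized fragments, so this is only an illustration of the output shape):

```java
public class HighlightDemo {
    // Mimic what preTags("<em>")/postTags("</em>") do to a matched term.
    static String highlight(String text, String keyword) {
        return text.replace(keyword, "<em>" + keyword + "</em>");
    }

    public static void main(String[] args) {
        System.out.println(highlight("互联网求职暖春计划", "互联网"));
        // -> <em>互联网</em>求职暖春计划
    }
}
```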
🚨 Note that this `Page` is the one Spring provides, not the one we built ourselves. Its current page number is counted from 0, while our own `Page` counts from 1, so remember to convert between them in the code that follows.

## Building the Community Search Feature

<img src="https://gitee.com/veal98/images/raw/master/img/20210129153411.png" style="zoom: 33%;" />

Posts are submitted to the Elasticsearch server asynchronously through the message queue, which improves performance.
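The page-number conversion mentioned above is a one-liner, but it is easy to get wrong; a small sketch (names are illustrative, not from the project):

```java
public class PageIndexDemo {
    // Our own Page counts the current page from 1; Spring Data's PageRequest counts from 0.
    static int toSpringPageIndex(int current) {
        return current - 1;
    }

    // The row offset of a 1-based page, as our own Page would compute it.
    static int offset(int current, int limit) {
        return (current - 1) * limit;
    }

    public static void main(String[] args) {
        System.out.println(toSpringPageIndex(1)); // -> 0
        System.out.println(offset(3, 10));        // -> 20
    }
}
```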
### Service

```java
/**
 * Search-related service
 */
@Service
public class ElasticsearchService {

    @Autowired
    private DiscussPostRepository discussPostRepository;

    @Autowired
    private ElasticsearchTemplate elasticsearchTemplate;

    /**
     * Save a post to the Elasticsearch server
     * @param post
     */
    public void saveDiscusspost(DiscussPost post) {
        discussPostRepository.save(post);
    }

    /**
     * Delete a post from the Elasticsearch server
     * @param id
     */
    public void deleteDiscusspost(int id) {
        discussPostRepository.deleteById(id);
    }

    /**
     * Search
     * @param keyword search keyword
     * @param current current page number (this Page is Spring's, not our own)
     * @param limit number of results per page
     * @return
     */
    public Page<DiscussPost> searchDiscussPost(String keyword, int current, int limit) {
        SearchQuery searchQuery = new NativeSearchQueryBuilder()
                .withQuery(QueryBuilders.multiMatchQuery(keyword, "title", "content"))
                .withSort(SortBuilders.fieldSort("type").order(SortOrder.DESC))
                .withSort(SortBuilders.fieldSort("score").order(SortOrder.DESC))
                .withSort(SortBuilders.fieldSort("createTime").order(SortOrder.DESC))
                .withPageable(PageRequest.of(current, limit))
                .withHighlightFields(
                        new HighlightBuilder.Field("title").preTags("<em>").postTags("</em>"),
                        new HighlightBuilder.Field("content").preTags("<em>").postTags("</em>")
                ).build();

        return elasticsearchTemplate.queryForPage(searchQuery, DiscussPost.class, new SearchResultMapper() {
            @Override
            public <T> AggregatedPage<T> mapResults(SearchResponse searchResponse, Class<T> aClass, Pageable pageable) {
                // The hits returned by the query
                SearchHits hits = searchResponse.getHits();
                if (hits.getTotalHits() <= 0) {
                    return null;
                }

                // Map each hit to a DiscussPost
                List<DiscussPost> list = new ArrayList<>();
                for (SearchHit hit : hits) {
                    DiscussPost post = new DiscussPost();

                    String id = hit.getSourceAsMap().get("id").toString();
                    post.setId(Integer.valueOf(id));

                    String userId = hit.getSourceAsMap().get("userId").toString();
                    post.setUserId(Integer.valueOf(userId));

                    String title = hit.getSourceAsMap().get("title").toString();
                    post.setTitle(title);

                    String content = hit.getSourceAsMap().get("content").toString();
                    post.setContent(content);

                    String status = hit.getSourceAsMap().get("status").toString();
                    post.setStatus(Integer.valueOf(status));

                    String createTime = hit.getSourceAsMap().get("createTime").toString();
                    post.setCreateTime(new Date(Long.valueOf(createTime)));

                    String commentCount = hit.getSourceAsMap().get("commentCount").toString();
                    post.setCommentCount(Integer.valueOf(commentCount));

                    // Replace title/content with the highlighted fragments, if any
                    HighlightField titleField = hit.getHighlightFields().get("title");
                    if (titleField != null) {
                        post.setTitle(titleField.getFragments()[0].toString());
                    }

                    HighlightField contentField = hit.getHighlightFields().get("content");
                    if (contentField != null) {
                        post.setContent(contentField.getFragments()[0].toString());
                    }

                    list.add(post);
                }

                return new AggregatedPageImpl(list, pageable,
                        hits.getTotalHits(), searchResponse.getAggregations(), searchResponse.getScrollId(), hits.getMaxScore());
            }
        });
    }

}
```
### Controller

- DiscussPostController

After a post is published, it is stored to the Elasticsearch server via the message queue:

<img src="https://gitee.com/veal98/images/raw/master/img/20210129155856.png" style="zoom: 50%;" />

- CommentController

After a comment is added to a post, the post is saved to the Elasticsearch server again via the message queue (the comment itself is not searched; re-saving the post keeps its indexed data, such as the comment count, current):

<img src="https://gitee.com/veal98/images/raw/master/img/20210129160046.png" style="zoom:50%;" />

- Add a method to the message-queue consumer that handles the post-publish event:
```java
/**
 * Consume the post-publish event
 */
@KafkaListener(topics = {TOPIC_PUBLISH})
public void handlePublishMessage(ConsumerRecord record) {
    if (record == null || record.value() == null) {
        logger.error("message content is empty");
        return;
    }
    Event event = JSONObject.parseObject(record.value().toString(), Event.class);
    if (event == null) {
        logger.error("malformed message");
        return;
    }

    DiscussPost post = discussPostSerivce.findDiscussPostById(event.getEntityId());
    elasticsearchService.saveDiscusspost(post);
}
```
- **SearchController**

```java
/**
 * Search
 */
@Controller
public class SearchController implements CommunityConstant {

    @Autowired
    private ElasticsearchService elasticsearchService;

    @Autowired
    private UserService userService;

    @Autowired
    private DiscussPostSerivce discussPostSerivce;

    @Autowired
    private LikeService likeService;

    // search?keyword=xxx
    @GetMapping("/search")
    public String search(String keyword, Page page, Model model) {
        // Search posts (Spring's Page counts the current page from 0)
        org.springframework.data.domain.Page<DiscussPost> searchResult =
                elasticsearchService.searchDiscussPost(keyword, page.getCurrent() - 1, page.getLimit());
        // Aggregate the data
        List<Map<String, Object>> discussPosts = new ArrayList<>();
        if (searchResult != null) {
            for (DiscussPost post : searchResult) {
                Map<String, Object> map = new HashMap<>();
                // The post
                map.put("post", post);
                // Its author
                map.put("user", userService.findUserById(post.getUserId()));
                // Its like count
                map.put("likeCount", likeService.findEntityLikeCount(ENTITY_TYPE_POST, post.getId()));

                discussPosts.add(map);
            }
        }

        model.addAttribute("discussPosts", discussPosts);
        model.addAttribute("keyword", keyword);

        // Pagination
        page.setPath("/search?keyword=" + keyword);
        page.setRows(searchResult == null ? 0 : (int) searchResult.getTotalElements());

        return "/site/search";
    }

}
```
### Front End

The search box in `index.html`:

```html
<form method="get" th:action="@{/search}">
    <input name="keyword" th:value="${keyword}" />
    <button type="submit"> 搜索</button>
</form>
```

The `name="keyword"` attribute must match the parameter name in the Controller.

The search results page, `search.html`:
```html
<li th:each="map:${discussPosts}">
    <img th:src="${map.user.headerUrl}" alt="用户头像">
    <a th:href="@{|/discuss/detail/${map.post.id}|}" th:utext="${map.post.title}"></a>
    <div th:utext="${map.post.content}"></div>
    <u th:utext="${map.user.username}"></u>
    发布于 <b th:text="${#dates.format(map.post.createTime, 'yyyy-MM-dd HH:mm:ss')}"></b>
    <ul>
        <li>赞 <i th:text="${map.likeCount}"></i></li>
        <li>|</li>
        <li>回复 <i th:text="${map.post.commentCount}"></i></li>
    </ul>
</li>
```
# Spring Security Access Control

---



Add the dependency:
```xml
<!--Spring Security-->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-security</artifactId>
</dependency>
```
## Authorization Configuration



The special paths accessible only to administrators will be configured later.
```java
@Configuration
public class SecurityConfig extends WebSecurityConfigurerAdapter implements CommunityConstant {

    /**
     * Static resources
     * @param web
     * @throws Exception
     */
    @Override
    public void configure(WebSecurity web) throws Exception {
        web.ignoring().antMatchers("/resources/**");
    }

    // For authentication we use our own code in LoginController, bypassing Spring Security's

    /**
     * Authorization
     * @param http
     * @throws Exception
     */
    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.authorizeRequests()
                .antMatchers(
                        "/user/setting",
                        "/user/upload",
                        "/discuss/add",
                        "/comment/add",
                        "/letter/**",
                        "/notice/**",
                        "/like",
                        "/follow",
                        "/unfollow"
                )
                .hasAnyAuthority(
                        AUTHORITY_USER,
                        AUTHORITY_ADMIN,
                        AUTHORITY_MODERATOR
                )
                .anyRequest().permitAll();

        // Handling insufficient permissions
        http.exceptionHandling()
                // 1. Not logged in
                .authenticationEntryPoint(new AuthenticationEntryPoint() {
                    @Override
                    public void commence(HttpServletRequest request, HttpServletResponse response, AuthenticationException e) throws IOException, ServletException {
                        String xRequestedWith = request.getHeader("x-requested-with");
                        if ("XMLHttpRequest".equals(xRequestedWith)) {
                            // Asynchronous request
                            response.setContentType("application/plain;charset=utf-8");
                            PrintWriter writer = response.getWriter();
                            writer.write(CommunityUtil.getJSONString(403, "你还没有登录"));
                        } else {
                            // Plain request
                            response.sendRedirect(request.getContextPath() + "/login");
                        }
                    }
                })
                // 2. Logged in but lacking the required authority
                .accessDeniedHandler(new AccessDeniedHandler() {
                    @Override
                    public void handle(HttpServletRequest request, HttpServletResponse response, AccessDeniedException e) throws IOException, ServletException {
                        String xRequestedWith = request.getHeader("x-requested-with");
                        if ("XMLHttpRequest".equals(xRequestedWith)) {
                            // Asynchronous request
                            response.setContentType("application/plain;charset=utf-8");
                            PrintWriter writer = response.getWriter();
                            writer.write(CommunityUtil.getJSONString(403, "你没有访问该功能的权限"));
                        } else {
                            // Plain request
                            response.sendRedirect(request.getContextPath() + "/denied");
                        }
                    }
                });

        // By default, Security intercepts /logout and performs its own logout handling.
        // Point it at a path that does not exist, so our own logout code gets to run.
        http.logout().logoutUrl("/securitylogout");
    }
}
```

By default, Spring Security intercepts the `/logout` request and performs its own logout handling. Here we point it at a logout path that does not actually exist, `/securitylogout` (a white lie), so that execution reaches the logout code we wrote ourselves.

## Authentication Scheme

The authentication step was already implemented earlier, so we do not use Security's. That is, we bypass Spring Security's authentication flow and use our own code, rather than filling in the authentication method Security provides:

```java
@Override
protected void configure(AuthenticationManagerBuilder auth) throws Exception {
    super.configure(auth);
}
```

**Since we do the authentication ourselves, we must store its result (user, password, authorities) into the `SecurityContext`** so that Security can perform authorization.

We already obtain the user in the `LoginTicketInterceptor` before every Controller runs, so we modify it as follows:

This requires a new method in `UserService` that returns a user's authorities:

```java
/**
 * Get a user's authorities
 * @param userId
 * @return
 */
public Collection<? extends GrantedAuthority> getAuthorities(int userId) {
    User user = this.findUserById(userId);
    List<GrantedAuthority> list = new ArrayList<>();
    list.add(new GrantedAuthority() {
        @Override
        public String getAuthority() {
            switch (user.getType()) {
                case 1:
                    return AUTHORITY_ADMIN;
                case 2:
                    return AUTHORITY_MODERATOR;
                default:
                    return AUTHORITY_USER;
            }
        }
    });
    return list;
}
```
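
The `type → authority` mapping above can be exercised in isolation. Here is a self-contained sketch; the constant values are assumptions standing in for `CommunityConstant`, which is not shown in this chapter:

```java
public class AuthorityMapping {
    // Assumed constants, mirroring CommunityConstant
    static final String AUTHORITY_USER = "user";
    static final String AUTHORITY_ADMIN = "admin";
    static final String AUTHORITY_MODERATOR = "moderator";

    /** Map the user's numeric type to an authority string, defaulting to an ordinary user. */
    static String authorityFor(int type) {
        switch (type) {
            case 1:  return AUTHORITY_ADMIN;
            case 2:  return AUTHORITY_MODERATOR;
            default: return AUTHORITY_USER;
        }
    }

    public static void main(String[] args) {
        System.out.println(authorityFor(0)); // ordinary user
        System.out.println(authorityFor(1)); // admin
    }
}
```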

Don't forget to clear the `SecurityContext` once the request finishes.

## CSRF Configuration

Disable the CSRF token check.

# Pinning, Featuring, and Deleting Posts

---

## Implementation

The service and DAO layers are straightforward, so they are not listed here.

### Controller

Pinning or featuring a post must also trigger a publish event so that the data on the Elasticsearch server gets updated.

`DiscusspostController`:

```java
/**
 * Pin a post to the top
 * @param id
 * @return
 */
@PostMapping("/top")
@ResponseBody
public String setTop(int id) {
    discussPostSerivce.updateType(id, 1);

    // Trigger a publish event so the post is re-indexed into Elasticsearch via the message queue
    Event event = new Event()
            .setTopic(TOPIC_PUBLISH)
            .setUserId(hostHolder.getUser().getId())
            .setEntityType(ENTITY_TYPE_POST)
            .setEntityId(id);
    eventProducer.fireEvent(event);

    return CommunityUtil.getJSONString(0);
}
```

When a post is deleted, its status in the database is set to 2, i.e. blocked. (We did not write a real delete method here; consider changing this to an actual delete when refactoring.) Deletion also triggers a delete event that removes the post from the Elasticsearch server.

`EventConsumer`:

```java
/**
 * Consume the delete-post event
 */
@KafkaListener(topics = {TOPIC_DELETE})
public void handleDeleteMessage(ConsumerRecord record) {
    if (record == null || record.value() == null) {
        logger.error("消息的内容为空");
        return;
    }
    Event event = JSONObject.parseObject(record.value().toString(), Event.class);
    if (event == null) {
        logger.error("消息格式错误");
        return;
    }

    elasticsearchService.deleteDiscusspost(event.getEntityId());
}
```

`DiscusspostController`:

```java
/**
 * Delete a post
 * @param id
 * @return
 */
@PostMapping("/delete")
@ResponseBody
public String setDelete(int id) {
    discussPostSerivce.updateStatus(id, 2);

    // Trigger a delete event so Elasticsearch is updated via the message queue
    Event event = new Event()
            .setTopic(TOPIC_DELETE)
            .setUserId(hostHolder.getUser().getId())
            .setEntityType(ENTITY_TYPE_POST)
            .setEntityId(id);
    eventProducer.fireEvent(event);

    return CommunityUtil.getJSONString(0);
}
```

### Front End

The hidden `input` field is a convenient way to pass the parameter `post.id` to the back-end methods above.

Each `button` has an `id` so that it can be referenced from the JavaScript.

```html
<input type="hidden" id="postId" th:value="${post.id}">
<button type="button" class="btn btn-danger btn-sm" id="topBtn"
        th:disabled="${post.type == 1}">置顶</button>
<button type="button" class="btn btn-danger btn-sm" id="wonderfulBtn"
        th:disabled="${post.status == 1}">加精</button>
<button type="button" class="btn btn-danger btn-sm" id="deleteBtn"
        th:disabled="${post.status == 2}">删除</button>
```

The corresponding `discuss.js`. (I found that only asynchronous requests seem to need logic in the js file, while normal requests do not; I'm not sure whether that's accurate — revisit when refactoring.)

```js
$(function(){
    $("#topBtn").click(setTop);
    $("#wonderfulBtn").click(setWonderful);
    $("#deleteBtn").click(setDelete);
});

// Pin to top
function setTop() {
    $.post(
        CONTEXT_PATH + "/discuss/top",   // endpoint path
        {"id":$("#postId").val()},       // parameters for the endpoint
        function (data) {                // callback receiving the endpoint's result
            data = $.parseJSON(data);    // the result is a JSON string; parse it into an object
            if (data.code == 0) {
                // On success, disable the pin button
                $("#topBtn").attr("disabled", "disabled");
            }
            else {
                alert(data.msg);
            }
        }
    )
}

// Featuring ("wonderful") works much the same as pinning

// Delete
function setDelete() {
    $.post(
        CONTEXT_PATH + "/discuss/delete",
        {"id":$("#postId").val()},
        function (data) {
            data = $.parseJSON(data);
            if (data.code == 0) {
                // On success, jump back to the home page
                location.href = CONTEXT_PATH + "/index";
            }
            else {
                alert(data.msg);
            }
        }
    )
}
```

## Permission Management

Just add a section like the following to `SecurityConfig`:

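A sketch of those rules, under the assumption that the paths match this chapter's controller mappings (pin/feature as moderator actions, delete as an admin action — verify against your own `SecurityConfig`):

```java
// Added to SecurityConfig.configure(HttpSecurity http)
http.authorizeRequests()
        .antMatchers("/discuss/top", "/discuss/wonderful")
        .hasAnyAuthority(AUTHORITY_MODERATOR)
        .antMatchers("/discuss/delete")
        .hasAnyAuthority(AUTHORITY_ADMIN);
```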

## Showing Buttons by Role

Showing different buttons to users with different permissions is implemented with Thymeleaf Extras Spring Security 5. Import the dependency:

```xml
<!-- thymeleaf-extras-springsecurity5 -->
<dependency>
    <groupId>org.thymeleaf.extras</groupId>
    <artifactId>thymeleaf-extras-springsecurity5</artifactId>
</dependency>
```

Add the namespace declaration `xmlns:sec="http://www.thymeleaf.org/extras/spring-security"` to the page header (note: `extras`, not `extra`):

```html
<html lang="en" xmlns:th="http://www.thymeleaf.org" xmlns:sec="http://www.thymeleaf.org/extras/spring-security">
```
# Site Data Statistics

---

Built on Redis's advanced data types:

<img src="https://gitee.com/veal98/images/raw/master/img/20210131104903.png" style="zoom: 33%;" />

DAU counts only logged-in users; UV covers both visitors and logged-in users.

DAU: one bitmap per day, with the user ID used as the bit index (offset). If the user visited that day, the corresponding bit is set to 1.

To compute DAU over a date range, OR the daily bitmaps together: a user counts as active if they logged in on at least one of those days.

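The OR-merge can be simulated with `java.util.BitSet`, standing in for the per-day Redis bitmaps (bit indices are user IDs):

```java
import java.util.BitSet;

public class DauMerge {
    /** OR several daily bitmaps together; a set bit means the user was active on at least one day. */
    static BitSet merge(BitSet... days) {
        BitSet result = new BitSet();
        for (BitSet day : days) {
            result.or(day);
        }
        return result;
    }

    public static void main(String[] args) {
        BitSet monday = new BitSet();
        monday.set(101);            // user 101 logged in on Monday
        BitSet tuesday = new BitSet();
        tuesday.set(205);           // user 205 logged in on Tuesday

        BitSet range = merge(monday, tuesday);
        System.out.println(range.cardinality()); // 2 active users over the range
    }
}
```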
`RedisKeyUtil`:

```java
/**
 * Single-day UV
 * @param date
 * @return
 */
public static String getUVKey(String date) {
    return PREFIX_UV + SPLIT + date;
}

/**
 * Range UV
 * @param startDate
 * @param endDate
 * @return
 */
public static String getUVKey(String startDate, String endDate) {
    return PREFIX_UV + SPLIT + startDate + SPLIT + endDate;
}

/**
 * Single-day DAU
 * @param date
 * @return
 */
public static String getDAUKey(String date) {
    return PREFIX_DAU + SPLIT + date;
}

/**
 * Range DAU
 * @param startDate
 * @param endDate
 * @return
 */
public static String getDAUKey(String startDate, String endDate) {
    return PREFIX_DAU + SPLIT + startDate + SPLIT + endDate;
}
```

## Service

```java
/**
 * Site data statistics (UV / DAU)
 */
@Service
public class DataService {

    @Autowired
    private RedisTemplate redisTemplate;

    private SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");

    /**
     * Count the given IP towards today's UV
     * @param ip
     */
    public void recordUV(String ip) {
        String redisKey = RedisKeyUtil.getUVKey(df.format(new Date()));
        redisTemplate.opsForHyperLogLog().add(redisKey, ip);
    }

    /**
     * Compute the UV over the given date range
     * @param start
     * @param end
     * @return
     */
    public long calculateUV(Date start, Date end) {
        if (start == null || end == null) {
            throw new IllegalArgumentException("参数不能为空");
        }

        // Collect the keys within the date range
        List<String> keyList = new ArrayList<>();
        Calendar calendar = Calendar.getInstance();
        calendar.setTime(start);
        while (!calendar.getTime().after(end)) {
            String key = RedisKeyUtil.getUVKey(df.format(calendar.getTime()));
            keyList.add(key);
            calendar.add(Calendar.DATE, 1); // advance one day
        }

        // Merge these days' UV into one HyperLogLog
        String redisKey = RedisKeyUtil.getUVKey(df.format(start), df.format(end));
        redisTemplate.opsForHyperLogLog().union(redisKey, keyList.toArray());

        // Return the merged count
        return redisTemplate.opsForHyperLogLog().size(redisKey);
    }

    /**
     * Count the given user towards today's DAU
     * @param userId
     */
    public void recordDAU(int userId) {
        String redisKey = RedisKeyUtil.getDAUKey(df.format(new Date()));
        redisTemplate.opsForValue().setBit(redisKey, userId, true);
    }

    /**
     * Compute the DAU over the given date range
     * @param start
     * @param end
     * @return
     */
    public long calculateDAU(Date start, Date end) {
        if (start == null || end == null) {
            throw new IllegalArgumentException("参数不能为空");
        }

        // Collect the keys within the date range
        List<byte[]> keyList = new ArrayList<>();
        Calendar calendar = Calendar.getInstance();
        calendar.setTime(start);
        while (!calendar.getTime().after(end)) {
            String key = RedisKeyUtil.getDAUKey(df.format(calendar.getTime()));
            keyList.add(key.getBytes());
            calendar.add(Calendar.DATE, 1); // advance one day
        }

        // OR the daily bitmaps together
        return (long) redisTemplate.execute(new RedisCallback() {
            @Override
            public Object doInRedis(RedisConnection redisConnection) throws DataAccessException {
                String redisKey = RedisKeyUtil.getDAUKey(df.format(start), df.format(end));
                redisConnection.bitOp(RedisStringCommands.BitOperation.OR,
                        redisKey.getBytes(), keyList.toArray(new byte[0][0]));
                return redisConnection.bitCount(redisKey.getBytes());
            }
        });
    }
}
```
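The Calendar loop that collects one key per day between `start` and `end` can be sketched on its own (the `uv:` prefix here is an assumption standing in for `RedisKeyUtil`):

```java
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.List;

public class DailyKeys {
    /** Build one "uv:yyyy-MM-dd" key per day from start to end, inclusive. */
    static List<String> keysBetween(Calendar start, Calendar end) {
        SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");
        List<String> keys = new ArrayList<>();
        Calendar cursor = (Calendar) start.clone();
        while (!cursor.getTime().after(end.getTime())) {
            keys.add("uv:" + df.format(cursor.getTime()));
            cursor.add(Calendar.DATE, 1); // advance one day
        }
        return keys;
    }

    public static void main(String[] args) {
        Calendar start = Calendar.getInstance();
        start.set(2021, Calendar.JANUARY, 30, 0, 0, 0);
        Calendar end = (Calendar) start.clone();
        end.add(Calendar.DATE, 2);
        System.out.println(keysBetween(start, end)); // three daily keys
    }
}
```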
## Interceptor

Record UV and DAU before every request is handled:

```java
@Component
public class DataInterceptor implements HandlerInterceptor {

    @Autowired
    private DataService dataService;

    @Autowired
    private HostHolder hostHolder;

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        // Record UV
        String ip = request.getRemoteHost();
        dataService.recordUV(ip);

        // Record DAU (logged-in users only)
        User user = hostHolder.getUser();
        if (user != null) {
            dataService.recordDAU(user.getId());
        }

        return true;
    }
}
```

Don't forget to register this interceptor in the WebMvc configuration.

## Controller

```java
/**
 * Site data
 */
@Controller
public class DataController {

    @Autowired
    private DataService dataService;

    /**
     * Open the statistics page
     * @return
     */
    @RequestMapping(value = "/data", method = {RequestMethod.GET, RequestMethod.POST})
    public String getDataPage() {
        return "/site/admin/data";
    }

    /**
     * Compute the site's UV
     * @param start
     * @param end
     * @return
     */
    @PostMapping("/data/uv")
    public String getUV(@DateTimeFormat(pattern = "yyyy-MM-dd") Date start,
                        @DateTimeFormat(pattern = "yyyy-MM-dd") Date end,
                        Model model) {
        long uv = dataService.calculateUV(start, end);
        model.addAttribute("uvResult", uv);
        model.addAttribute("uvStartDate", start);
        model.addAttribute("uvEndDate", end);
        return "forward:/data";
    }

    /**
     * Compute the site's DAU
     * @param start
     * @param end
     * @return
     */
    @PostMapping("/data/dau")
    public String getDAU(@DateTimeFormat(pattern = "yyyy-MM-dd") Date start,
                         @DateTimeFormat(pattern = "yyyy-MM-dd") Date end,
                         Model model) {
        long dau = dataService.calculateDAU(start, end);
        model.addAttribute("dauResult", dau);
        model.addAttribute("dauStartDate", start);
        model.addAttribute("dauEndDate", end);
        return "forward:/data";
    }

}
```

Don't forget to grant the administrator role access to this path in `SecurityConfig`.

## Front End

`data.html`:

```html
<form method="post" th:action="@{/data/dau}">
    <input type="date" name="start"
           th:value="${#dates.format(dauStartDate, 'yyyy-MM-dd')}" />
    <input type="date" required name="end"
           th:value="${#dates.format(dauEndDate, 'yyyy-MM-dd')}" />
    <button type="submit">开始统计</button>
</form>
统计结果 <span th:text="${dauResult}"></span>
```
# Hot Post Ranking

---

Post scores are recalculated periodically, which calls for a distributed scheduled task.

Import Spring Quartz:

```xml
<!-- Spring Quartz -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-quartz</artifactId>
</dependency>
```

Because Spring Quartz keeps its state in the database, we must create the tables Quartz needs ahead of time by running the `init_quartz.sql` script.

<img src="https://gitee.com/veal98/images/raw/master/img/20210131155131.png" style="zoom: 67%;" />

Score calculation design:

## Score Calculation

Whenever a post is liked, commented on, or featured, its ID is added to a Redis set; the distributed scheduled task then periodically pops these posts from the cache and recomputes their scores.

`RedisKeyUtil`:

```java
/**
 * Post score
 * @return
 */
public static String getPostScoreKey() {
    return PREFIX_POST + SPLIT + "score";
}
```

### Controller

Taking the like operation as an example:

<img src="https://gitee.com/veal98/images/raw/master/img/20210131173755.png" style="zoom: 50%;" />

### Job

```java
/**
 * Recalculate and refresh post scores
 */
public class PostScoreRefreshJob implements Job, CommunityConstant {

    private static final Logger logger = LoggerFactory.getLogger(PostScoreRefreshJob.class);

    @Autowired
    private RedisTemplate redisTemplate;

    @Autowired
    private DiscussPostSerivce discussPostSerivce;

    @Autowired
    private LikeService likeService;

    @Autowired
    private ElasticsearchService elasticsearchService;

    // The site's epoch: scores measure days elapsed since this moment
    private static final Date epoch;

    static {
        try {
            epoch = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2014-01-01 00:00:00");
        } catch (ParseException e) {
            throw new RuntimeException("初始化 Epoch 纪元失败", e);
        }
    }

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        String redisKey = RedisKeyUtil.getPostScoreKey();
        BoundSetOperations operations = redisTemplate.boundSetOps(redisKey);

        if (operations.size() == 0) {
            logger.info("[任务取消] 没有需要刷新的帖子");
            return;
        }

        logger.info("[任务开始] 正在刷新帖子分数: " + operations.size());
        while (operations.size() > 0) {
            this.refresh((Integer) operations.pop());
        }
        logger.info("[任务结束] 帖子分数刷新完毕");
    }

    /**
     * Refresh a single post's score
     * @param postId
     */
    private void refresh(int postId) {
        DiscussPost post = discussPostSerivce.findDiscussPostById(postId);

        if (post == null) {
            logger.error("该帖子不存在: id = " + postId);
            return;
        }

        // Featured?
        boolean wonderful = post.getStatus() == 1;
        // Number of comments
        int commentCount = post.getCommentCount();
        // Number of likes
        long likeCount = likeService.findEntityLikeCount(ENTITY_TYPE_POST, postId);

        // Weight
        double w = (wonderful ? 75 : 0) + commentCount * 10 + likeCount * 2;
        // Score = log of the weight + days elapsed since the epoch
        double score = Math.log10(Math.max(w, 1))
                + (post.getCreateTime().getTime() - epoch.getTime()) / (1000 * 3600 * 24);
        // Update the post's score
        discussPostSerivce.updateScore(postId, score);
        // Keep the search index in sync
        post.setScore(score);
        elasticsearchService.saveDiscusspost(post);
    }
}
```
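
The formula in `refresh` — score = log10(max(weight, 1)) + whole days since the site epoch — can be reproduced in plain Java:

```java
public class PostScore {
    static final long MILLIS_PER_DAY = 1000L * 3600 * 24;

    /** score = log10(max(w, 1)) + whole days between the epoch and creation time. */
    static double score(boolean wonderful, int commentCount, long likeCount,
                        long createMillis, long epochMillis) {
        double w = (wonderful ? 75 : 0) + commentCount * 10 + likeCount * 2;
        // The day count uses integer (truncating) division, as in the job above
        return Math.log10(Math.max(w, 1))
                + (createMillis - epochMillis) / MILLIS_PER_DAY;
    }

    public static void main(String[] args) {
        long epoch = 0L;
        long tenDaysLater = 10 * MILLIS_PER_DAY;
        // A featured post with 10 comments and 5 likes, created 10 days after the epoch:
        // w = 75 + 100 + 10 = 185, so score = log10(185) + 10
        System.out.println(score(true, 10, 5, tenDaysLater, epoch));
    }
}
```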

Note that the data on the Elasticsearch server is synchronized at the same time.

### Quartz Config

```java
/**
 * Spring Quartz configuration. Job and trigger metadata is written to the database,
 * from which Quartz reads it afterwards.
 */
@Configuration
public class QuartzConfig {

    /**
     * The job that refreshes post scores
     * @return
     */
    @Bean
    public JobDetailFactoryBean postScoreRefreshJobDetail() {
        JobDetailFactoryBean factoryBean = new JobDetailFactoryBean();
        factoryBean.setJobClass(PostScoreRefreshJob.class);
        factoryBean.setName("postScoreRefreshJob");
        factoryBean.setGroup("communityJobGroup");
        factoryBean.setDurability(true);
        factoryBean.setRequestsRecovery(true);
        return factoryBean;
    }

    /**
     * The trigger for the refresh job
     * @return
     */
    @Bean
    public SimpleTriggerFactoryBean postScoreRefreshTrigger(JobDetail postScoreRefreshJobDetail) {
        SimpleTriggerFactoryBean factoryBean = new SimpleTriggerFactoryBean();
        factoryBean.setJobDetail(postScoreRefreshJobDetail);
        factoryBean.setName("postScoreRefreshTrigger");
        factoryBean.setGroup("communityTriggerGroup");
        factoryBean.setRepeatInterval(1000 * 60 * 5); // run every 5 minutes
        factoryBean.setJobDataMap(new JobDataMap());
        return factoryBean;
    }
}
```

## Ranking by Score

Refactor the paged post query:

```java
/**
 * Paged query over discussion posts
 *
 * @param userId when 0, query all users' posts;
 *               otherwise, query the given user's posts
 * @param offset starting index of the page
 * @param limit  number of rows per page
 * @param orderMode sort mode (1 sorts by heat)
 * @return
 */
List<DiscussPost> selectDiscussPosts(int userId, int offset, int limit, int orderMode);
```

An `orderMode` parameter is added: 1 sorts by heat (score), 0 (the default) sorts by newest. Pinned posts stay at the top either way.

The corresponding service method is updated accordingly:

```java
/**
 * Paged query over discussion posts
 *
 * @param userId when 0, query all users' posts;
 *               otherwise, query the given user's posts
 * @param offset starting index of the page
 * @param limit  number of rows per page
 * @param orderMode sort mode (1 sorts by heat)
 * @return
 */
public List<DiscussPost> findDiscussPosts(int userId, int offset, int limit, int orderMode) {
    return discussPostMapper.selectDiscussPosts(userId, offset, limit, orderMode);
}
```
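
The two order modes correspond to two sort keys. A comparator sketch, using a simplified `Post` stand-in for `DiscussPost` (pinned posts come first in both modes, as the SQL does):

```java
import java.util.Arrays;
import java.util.Comparator;

public class PostOrdering {
    static class Post {
        final int type;          // 1 = pinned
        final double score;      // heat
        final long createTime;   // millis
        Post(int type, double score, long createTime) {
            this.type = type; this.score = score; this.createTime = createTime;
        }
    }

    /** orderMode 1: pinned first, then by score; orderMode 0: pinned first, then by newest. */
    static Comparator<Post> comparator(int orderMode) {
        Comparator<Post> pinnedFirst = Comparator.comparingInt((Post p) -> p.type).reversed();
        if (orderMode == 1) {
            return pinnedFirst.thenComparing(Comparator.comparingDouble((Post p) -> p.score).reversed());
        }
        return pinnedFirst.thenComparing(Comparator.comparingLong((Post p) -> p.createTime).reversed());
    }

    public static void main(String[] args) {
        Post[] posts = {
            new Post(0, 5.0, 100), new Post(1, 1.0, 50), new Post(0, 9.0, 10)
        };
        Arrays.sort(posts, comparator(1));
        System.out.println(posts[0].type); // the pinned post comes first
    }
}
```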

Then update the corresponding `HomeController`.

GET requests pass parameters through the query string, e.g. `/index?orderMode=1`; in Thymeleaf this is written `th:href="@{/index(orderMode=0)}"`.

Parameters can also be embedded directly in the URL path (path variables), e.g. `/index/${post.id}`, written in Thymeleaf as `th:href="@{|/index/${post.id}|}"`; POST requests, by contrast, carry their parameters in the request body.

Update `index.html`:

```html
<a th:class="|nav-link ${orderMode==0 ? 'active' : ''}|" th:href="@{/index(orderMode=0)}"><i class="bi bi-lightning"></i> 最新</a>

<a th:class="|nav-link ${orderMode==1 ? 'active' : ''}|" th:href="@{/index(orderMode=1)}"><i class="bi bi-hand-thumbs-up"></i> 最热</a>
```
# 🛒 Registration

---

## 1. Development Steps

Visit the registration page.

Submit the registration data:

- Data is submitted through a form
- The server checks whether the account or email is already registered
- The server sends an activation email

Activate the account:

- The user clicks the link in the email, which hits the server's activation endpoint

## 2. Code

### ① Submitting Registration Data

#### UserService

```java
package com.greate.community.service;

import com.greate.community.dao.UserMapper;
import com.greate.community.entity.User;
import com.greate.community.util.CommunityUtil;
import com.greate.community.util.MailClient;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import org.thymeleaf.TemplateEngine;
import org.thymeleaf.context.Context;

import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.Random;

@Service
public class UserService {

    @Autowired
    private UserMapper userMapper;

    @Autowired
    private MailClient mailClient;

    @Autowired
    private TemplateEngine templateEngine;

    // Site domain
    @Value("${community.path.domain}")
    private String domain;

    // Context path: http://localhost:8080/greatecommunity/......
    @Value("${server.servlet.context-path}")
    private String contextPath;

    /**
     * Find a user by id
     * @param id
     * @return
     */
    public User findUserById(int id) {
        return userMapper.selectById(id);
    }

    /**
     * Register a user
     * @param user
     * @return a map of error messages; an empty map means registration succeeded
     */
    public Map<String, Object> register(User user) {
        Map<String, Object> map = new HashMap<>();

        if (user == null) {
            throw new IllegalArgumentException("参数不能为空");
        }
        if (StringUtils.isBlank(user.getUsername())) {
            map.put("usernameMsg", "账号不能为空");
            return map;
        }

        if (StringUtils.isBlank(user.getPassword())) {
            map.put("passwordMsg", "密码不能为空");
            return map;
        }

        if (StringUtils.isBlank(user.getEmail())) {
            map.put("emailMsg", "邮箱不能为空");
            return map;
        }

        // Is the account name already taken?
        User u = userMapper.selectByName(user.getUsername());
        if (u != null) {
            map.put("usernameMsg", "该账号已存在");
            return map;
        }

        // Is the email already registered?
        u = userMapper.selectByEmail(user.getEmail());
        if (u != null) {
            map.put("emailMsg", "该邮箱已被注册");
            return map;
        }

        // Register the user
        user.setSalt(CommunityUtil.generateUUID().substring(0, 5)); // salt
        user.setPassword(CommunityUtil.md5(user.getPassword() + user.getSalt())); // salted hash
        user.setType(0); // ordinary user by default
        user.setStatus(0); // not yet activated
        user.setActivationCode(CommunityUtil.generateUUID()); // activation code
        // Random avatar (the user can change it after logging in)
        user.setHeaderUrl(String.format("http://images.nowcoder.com/head/%dt.png", new Random().nextInt(1000)));
        user.setCreateTime(new Date()); // registration time
        userMapper.insertUser(user);

        // Send the activation email
        Context context = new Context();
        context.setVariable("email", user.getEmail());
        // http://localhost:8080/greatecommunity/activation/{userId}/{activationCode}
        String url = domain + contextPath + "/activation/" + user.getId() + "/" + user.getActivationCode();
        context.setVariable("url", url);
        String content = templateEngine.process("/mail/activation", context);
        mailClient.sendMail(user.getEmail(), "激活 Greate Community 账号", content);

        return map;
    }

}
```
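
The salt-then-MD5 step can be reproduced with the JDK alone. This sketch plays the role of `CommunityUtil.md5`, whose real implementation is not shown in this chapter:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class PasswordHashing {
    /** Hex-encoded MD5, as a stand-in for CommunityUtil.md5. */
    static String md5(String input) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String salt = "a1b2c";                     // first 5 chars of a UUID in the real code
        String stored = md5("secret" + salt);      // what ends up in the user table
        System.out.println(stored.length());       // 32 hex characters
    }
}
```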

#### LoginController

```java
/**
 * Login and registration
 */
@Controller
public class LoginController {

    @Autowired
    UserService userService;

    /**
     * Open the registration page
     * @return
     */
    @GetMapping("/register")
    public String getRegisterPage() {
        return "site/register";
    }

    /**
     * Register a user
     * @param model
     * @param user
     * @return
     */
    @PostMapping("/register")
    public String register(Model model, User user) {
        Map<String, Object> map = userService.register(user);
        if (map == null || map.isEmpty()) {
            model.addAttribute("msg", "注册成功, 我们已经向您的邮箱发送了一封激活邮件,请尽快激活!");
            model.addAttribute("target", "/index");
            return "/site/operate-result";
        } else {
            model.addAttribute("usernameMsg", map.get("usernameMsg"));
            model.addAttribute("passwordMsg", map.get("passwordMsg"));
            model.addAttribute("emailMsg", map.get("emailMsg"));
            return "/site/register";
        }
    }

}
```

On success, the user is taken to an intermediate page, `operate-result`, which announces the successful registration and asks them to activate via email:

```html
<p>
    系统会在 <span id="seconds" class="text-danger">8</span> 秒后自动跳转,
    您也可以点击 <a id="target" th:href="@{${target}}" class="text-primary">此链接</a>, 手动跳转!
</p>
```

#### Submitting the Form from the Front End

```html
<form class="mt-5" method="post" th:action="@{/register}">
    <input name="username" />
```

Spring MVC binds request parameters to bean properties by name, so the form's `username` field is injected into `user.username`:

```java
@PostMapping("/register")
public String register(Model model, User user)
```
### ② Activating the Account

#### UserService

```java
/**
 * Activate a user
 * @param userId user id
 * @param code activation code
 * @return
 */
public int activation(int userId, String code) {
    User user = userMapper.selectById(userId);
    if (user.getStatus() == 1) {
        // Already activated
        return ACTIVATION_REPEAT;
    }
    else if (user.getActivationCode().equals(code)) {
        // Mark the user as activated
        userMapper.updateStatus(userId, 1);
        return ACTIVATION_SUCCESS;
    }
    else {
        return ACTIVATION_FAILURE;
    }
}
```

#### LoginController

When the user clicks the link in the activation email, this endpoint activates the account:

```java
// Account activation
// http://localhost:8080/greatecommunity/activation/{userId}/{activationCode}
@GetMapping("/activation/{userId}/{code}")
public String activation(Model model, @PathVariable("userId") int userId, @PathVariable("code") String code) {
    int result = userService.activation(userId, code);
    if (result == ACTIVATION_SUCCESS) {
        model.addAttribute("msg", "激活成功, 您的账号已经可以正常使用!");
        model.addAttribute("target", "/login");
    }
    else if (result == ACTIVATION_REPEAT) {
        model.addAttribute("msg", "无效的操作, 您的账号已被激活过!");
        model.addAttribute("target", "/index");
    }
    else {
        model.addAttribute("msg", "激活失败, 您提供的激活码不正确!");
        model.addAttribute("target", "/index");
    }
    return "/site/operate-result";
}
```
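
Assembling the activation URL is plain string work; a sketch with made-up domain and IDs (note the slash before the user ID, which is easy to drop):

```java
public class ActivationUrl {
    /** domain + contextPath + "/activation/" + userId + "/" + code */
    static String build(String domain, String contextPath, int userId, String code) {
        return domain + contextPath + "/activation/" + userId + "/" + code;
    }

    public static void main(String[] args) {
        // Hypothetical values; the real ones come from application.properties
        String url = build("http://localhost:8080", "/greatecommunity", 154, "abc-123");
        System.out.println(url);
    }
}
```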
# Generating a CAPTCHA

---

## Import the Kaptcha Dependency

## Configure Kaptcha

```java
@Configuration
public class KaptchaConfig {

    @Bean
    public Producer kaptchaProducer() {
        Properties properties = new Properties();
        properties.setProperty("kaptcha.image.width", "100");
        properties.setProperty("kaptcha.image.height", "40");
        properties.setProperty("kaptcha.textproducer.font.size", "32");
        properties.setProperty("kaptcha.textproducer.font.color", "black");
        // Character set to draw from
        properties.setProperty("kaptcha.textproducer.char.string", "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ");
        // Number of characters to generate
        properties.setProperty("kaptcha.textproducer.char.length", "4");
        // Disable noise (note the property name is kaptcha.noise.impl)
        properties.setProperty("kaptcha.noise.impl", "com.google.code.kaptcha.impl.NoNoise");

        DefaultKaptcha kaptcha = new DefaultKaptcha();
        Config config = new Config(properties);
        kaptcha.setConfig(config);
        return kaptcha;
    }

}
```

## Generating the Random Text and Image

```java
@GetMapping("/kaptcha")
public void getKaptcha(HttpServletResponse response, HttpSession session) {
    // Generate the CAPTCHA
    String text = kaptchaProducer.createText(); // random text
    BufferedImage image = kaptchaProducer.createImage(text); // render it into an image

    // Keep the text in the session for later verification
    session.setAttribute("kaptcha", text);

    // Stream the image to the browser
    response.setContentType("image/png");
    try {
        ServletOutputStream os = response.getOutputStream();
        ImageIO.write(image, "png", os);
    } catch (IOException e) {
        logger.error("响应验证码失败: " + e.getMessage());
    }
}
```
```js
<img th:src="@{/kaptcha}" id="kaptcha"/>
<a href="javascript:refresh_kaptcha();">刷新验证码</a>

<script>
    function refresh_kaptcha() {
        var path = CONTEXT_PATH + "/kaptcha?p=" + Math.random();
        $("#kaptcha").attr("src", path);
    }
</script>
```

When the browser sees that the `src` path has changed, it re-requests the server, i.e. calls `getKaptcha` again. The `p` parameter exists only to change the URL (some browsers skip the request when the URL is unchanged); the server never reads it.

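What Kaptcha's text producer does — pick a few characters from a fixed alphabet — can be sketched in plain Java. This is an illustrative stand-in, not Kaptcha's actual implementation:

```java
import java.util.Random;

public class CaptchaText {
    static final String ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";

    /** Build a random code of the given length from ALPHABET. */
    static String createText(int length, Random random) {
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append(ALPHABET.charAt(random.nextInt(ALPHABET.length())));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(createText(4, new Random())); // e.g. a 4-character code
    }
}
```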
# Login and Logout

---

Login:

- Verify the account, password, and CAPTCHA
- On success, generate a login ticket, mark it valid, and hand it to the client. (Why a database table instead of the session? Sessions put pressure on the server and complicate distributed deployment.)
- On failure, return to the login page

Logout:

- Mark the login ticket invalid
- Redirect to the home page

## DAO

```java
@Mapper
public interface LoginTicketMapper {

    /**
     * Insert a LoginTicket
     * @param loginTicket
     * @return
     */
    int insertLoginTicket(LoginTicket loginTicket);

    /**
     * Find a LoginTicket by ticket
     * @param ticket
     * @return
     */
    LoginTicket selectByTicket(String ticket);

    /**
     * Update the ticket status. 0: valid, 1: invalid
     * @param ticket
     * @param status
     * @return
     */
    int updateStatus(String ticket, int status);

}
```

The corresponding `mapper.xml`:

```xml
<sql id="insertFields">
    user_id, ticket, status, expired
</sql>

<sql id="selectFields">
    id, user_id, ticket, status, expired
</sql>

<!-- Find a ticket record by ticket -->
<select id="selectByTicket" resultType="LoginTicket">
    select <include refid="selectFields"></include>
    from login_ticket
    where ticket = #{ticket}
</select>

<!-- Insert a ticket record -->
<insert id="insertLoginTicket" parameterType="LoginTicket" keyProperty="id">
    insert into login_ticket (<include refid="insertFields"></include>)
    values(#{userId}, #{ticket}, #{status}, #{expired})
</insert>

<!-- Update a ticket's status by ticket -->
<update id="updateStatus">
    update login_ticket set status = #{status} where ticket = #{ticket}
</update>
```

## Service

```java
|
||||
/**
|
||||
* 用户登录(为用户创建凭证)
|
||||
* @param username
|
||||
* @param password
|
||||
* @param expiredSeconds 多少秒后凭证过期
|
||||
* @return Map<String, Object> 返回错误提示消息以及 ticket(凭证)
|
||||
*/
|
||||
public Map<String, Object> login(String username, String password, int expiredSeconds) {
|
||||
Map<String, Object> map = new HashMap<>();
|
||||
|
||||
// 空值处理
|
||||
if (StringUtils.isBlank(username)) {
|
||||
map.put("usernameMsg", "账号不能为空");
|
||||
return map;
|
||||
}
|
||||
if (StringUtils.isBlank(password)) {
|
||||
map.put("passwordMsg", "密码不能为空");
|
||||
return map;
|
||||
}
|
||||
|
||||
// 验证账号
|
||||
User user = userMapper.selectByName(username);
|
||||
if (user == null) {
|
||||
map.put("usernameMsg", "该账号不存在");
|
||||
return map;
|
||||
}
|
||||
|
||||
// 验证状态
|
||||
if (user.getStatus() == 0) {
|
||||
// 账号未激活
|
||||
map.put("usernameMsg", "该账号未激活");
|
||||
return map;
|
||||
}
|
||||
|
||||
// 验证密码
|
||||
password = CommunityUtil.md5(password + user.getSalt());
|
||||
if (!user.getPassword().equals(password)) {
|
||||
map.put("passwordMsg", "密码错误");
|
||||
return map;
|
||||
}
|
||||
|
||||
// 用户名和密码均正确,为该用户生成登录凭证
|
||||
LoginTicket loginTicket = new LoginTicket();
|
||||
loginTicket.setUserId(user.getId());
|
||||
loginTicket.setTicket(CommunityUtil.generateUUID()); // 随机凭证
|
||||
loginTicket.setStatus(0); // 设置凭证状态为有效(当用户登出的时候,设置凭证状态为无效)
|
||||
loginTicket.setExpired(new Date(System.currentTimeMillis() + expiredSeconds * 1000)); // 设置凭证到期时间
|
||||
|
||||
loginTicketMapper.insertLoginTicket(loginTicket);
|
||||
|
||||
map.put("ticket", loginTicket.getTicket());
|
||||
|
||||
return map;
|
||||
}
|
||||
|
||||
/**
|
||||
* 用户退出(将凭证状态设为无效)
|
||||
* @param ticket
|
||||
*/
|
||||
public void logout(String ticket) {
|
||||
loginTicketMapper.updateStatus(ticket, 1);
|
||||
}
|
||||
```
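`CommunityUtil.md5` and `CommunityUtil.generateUUID` are project helpers whose source is not shown in this doc; a minimal JDK-only sketch of what they might look like (class and method names here are assumptions):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.UUID;

public class CommunityUtilSketch {

    // Hex-encoded MD5 digest; the real CommunityUtil.md5 likely wraps something
    // equivalent (Spring's DigestUtils.md5DigestAsHex is a common choice)
    public static String md5(String input) {
        try {
            MessageDigest digest = MessageDigest.getInstance("MD5");
            byte[] hash = digest.digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : hash) {
                sb.append(String.format("%02x", b)); // two lowercase hex digits per byte
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // MD5 is always available on the JDK
        }
    }

    // Random 32-character ticket: a UUID with the dashes removed
    public static String generateUUID() {
        return UUID.randomUUID().toString().replaceAll("-", "");
    }

    public static void main(String[] args) {
        System.out.println(md5("123456" + "somesalt")); // 32 hex chars, deterministic per salt
        System.out.println(generateUUID());             // 32 chars, different every call
    }
}
```

Salting means two users with the same password still store different digests, so a leaked table cannot be attacked with a single precomputed rainbow table.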
## Controller

```java
/**
 * User login
 * @param username
 * @param password
 * @param code the captcha
 * @param rememberMe whether to remember the user (extends the ticket's lifetime)
 * @param model
 * @param session the captcha is read from the session
 * @param response
 * @return
 */
@PostMapping("/login")
public String login(@RequestParam("username") String username,
                    @RequestParam("password") String password,
                    @RequestParam("code") String code,
                    @RequestParam(value = "rememberMe", required = false) boolean rememberMe,
                    Model model, HttpSession session, HttpServletResponse response) {
    // Check the captcha
    String kaptcha = (String) session.getAttribute("kaptcha");
    if (StringUtils.isBlank(kaptcha) || StringUtils.isBlank(code) || !kaptcha.equalsIgnoreCase(code)) {
        model.addAttribute("codeMsg", "验证码错误");
        return "/site/login";
    }

    // Ticket lifetime, depending on "remember me"
    int expiredSeconds = rememberMe ? REMEMBER_EXPIRED_SECONDS : DEFAULT_EXPIRED_SECONDS;
    // Verify username and password
    Map<String, Object> map = userService.login(username, password, expiredSeconds);
    if (map.containsKey("ticket")) {
        // Credentials are correct: the server generated a ticket, which the browser stores in a cookie
        Cookie cookie = new Cookie("ticket", map.get("ticket").toString());
        cookie.setPath(contextPath); // cookie scope
        cookie.setMaxAge(expiredSeconds);
        response.addCookie(cookie);
        return "redirect:/index";
    } else {
        model.addAttribute("usernameMsg", map.get("usernameMsg"));
        model.addAttribute("passwordMsg", map.get("passwordMsg"));
        return "/site/login";
    }
}

/**
 * User logout
 * @param ticket the ticket to mark invalid
 * @return
 */
@GetMapping("/logout")
public String logout(@CookieValue("ticket") String ticket) {
    userService.logout(ticket);
    return "redirect:/login";
}
```

Note that

```java
@RequestParam(value = "rememberMe", required = false) boolean rememberMe
```

means the front end may omit the rememberMe parameter entirely, since a user might not tick "remember me".

## Front End

```html
<form class="mt-5" method="post" th:action="@{/login}">

    <input type="text" th:class="|form-control ${usernameMsg != null ? 'is-invalid' : ''}|"
           th:value="${param.username}"
           name="username" required>
    <div class="invalid-feedback" th:text="${usernameMsg}"></div>
```

# Displaying Login Info

---

## Define an Interceptor (implement HandlerInterceptor)

- Query the logged-in user when a request starts

- **Hold the user data for the duration of the request**

  Store the user data via `ThreadLocal` in the map owned by the current thread. The thread stays alive as long as the request is being processed; once the server has responded to the request, the thread is released.

- Expose the user data to the template view

- Clean up the user data when the request ends
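The `HostHolder` bean used below is essentially a `ThreadLocal` container. A minimal self-contained sketch (for the demo, the held value is reduced to a plain `String` instead of the project's `User` entity — an assumption):

```java
// Minimal sketch of a ThreadLocal-based holder like the project's HostHolder.
public class HostHolderSketch {

    private static final ThreadLocal<String> USERS = new ThreadLocal<>();

    public static void setUser(String user) { USERS.set(user); }

    public static String getUser() { return USERS.get(); }

    // Must be called when the request ends: servlet worker threads are pooled
    // and reused, so a stale value would leak one user's data into the next request
    public static void clear() { USERS.remove(); }

    public static void main(String[] args) throws Exception {
        setUser("alice");
        // Another thread (i.e. another request) sees its own value, not ours
        Thread other = new Thread(() -> {
            if (getUser() != null) throw new AssertionError("leak across threads");
        });
        other.start();
        other.join();
        System.out.println(getUser()); // alice
        clear();
        System.out.println(getUser()); // null
    }
}
```

This is why the interceptor's `afterCompletion` below calls `hostHolder.clear()`.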

![](https://gitee.com/veal98/images/raw/master/img/20210131220403.png)

```java
@Component
public class LoginTicketInterceptor implements HandlerInterceptor {

    @Autowired
    private UserService userService;

    @Autowired
    private HostHolder hostHolder;

    /**
     * Called before the Controller executes
     * @param request
     * @param response
     * @param handler
     * @return
     * @throws Exception
     */
    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        // Read the ticket from the cookie
        String ticket = CookieUtil.getValue(request, "ticket");
        if (ticket != null) {
            // Query the ticket
            LoginTicket loginTicket = userService.findLoginTicket(ticket);
            // Check that the ticket is valid and not expired
            if (loginTicket != null && loginTicket.getStatus() == 0 && loginTicket.getExpired().after(new Date())) {
                // Query the user by ticket
                User user = userService.findUserById(loginTicket.getUserId());
                // Hold the user for the duration of this request
                hostHolder.setUser(user);
            }
        }

        return true;
    }

    /**
     * Called before the template engine renders
     * @param request
     * @param response
     * @param handler
     * @param modelAndView
     * @throws Exception
     */
    @Override
    public void postHandle(HttpServletRequest request, HttpServletResponse response, Object handler, ModelAndView modelAndView) throws Exception {
        User user = hostHolder.getUser();
        if (user != null && modelAndView != null) {
            modelAndView.addObject("loginUser", user);
        }
    }

    /**
     * Called after the Controller executes (i.e. after the server has responded to the request)
     * @param request
     * @param response
     * @param handler
     * @param ex
     * @throws Exception
     */
    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex) throws Exception {
        hostHolder.clear();
    }
}
```

## Configure the Interceptor

```java
/**
 * Interceptor configuration class
 */
@Configuration
public class WebMvcConfig implements WebMvcConfigurer {

    @Autowired
    private LoginTicketInterceptor loginTicketInterceptor;

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        registry.addInterceptor(loginTicketInterceptor)
                .excludePathPatterns("/css/**", "/js/**", "/img/**");
    }
}
```

## Front End

```html
<li th:if="${loginUser == null}">
    <a th:href="@{/register}">注册</a>
</li>

<img th:src="${loginUser.headerUrl}" />
```

# Account Settings

---

## Change Avatar (File Upload)

- Request: must be a POST request
- Form: enctype = "multipart/form-data"
- Spring MVC: handle the uploaded file via MultipartFile

### DAO

```java
/**
 * Update avatar
 * @param id
 * @param headerUrl
 * @return
 */
int updateHeader(int id, String headerUrl);
```

### Service

```java
/**
 * Update a user's avatar
 * @param userId
 * @param headUrl
 * @return
 */
public int updateHeader(int userId, String headUrl) {
    return userMapper.updateHeader(userId, headUrl);
}
```

### Controller

```java
@Controller
@RequestMapping("/user")
public class UserController {

    ......

    /**
     * Update the user's avatar
     * @param headerImage
     * @param model
     * @return
     */
    @PostMapping("/upload")
    public String uploadHeader(MultipartFile headerImage, Model model) {
        if (headerImage == null) {
            model.addAttribute("error", "您还没有选择图片");
            return "/site/setting";
        }

        // Get the file extension
        String fileName = headerImage.getOriginalFilename();
        String suffix = fileName.substring(fileName.lastIndexOf(".")); // extension, including the dot
        if (StringUtils.isBlank(suffix)) {
            model.addAttribute("error", "图片文件格式不正确");
            return "/site/setting";
        }

        // Give the uploaded image a new random file name
        fileName = CommunityUtil.generateUUID() + suffix;
        // Determine where the file will be stored
        File dest = new File(uploadPath + "/" + fileName);
        try {
            // Store the image file
            headerImage.transferTo(dest);
        } catch (IOException e) {
            logger.error("上传文件失败" + e.getMessage());
            throw new RuntimeException("上传文件失败,服务器发生异常", e);
        }

        // Update the current user's avatar path (web access path)
        // http://localhost:8080/echo/user/header/xxx.png
        User user = hostHolder.getUser();
        String headUrl = domain + contextPath + "/user/header/" + fileName;
        userService.updateHeader(user.getId(), headUrl);

        return "redirect:/index";
    }

    /**
     * Serve / display the uploaded image,
     * i.e. resolve the user's avatar path (web access path) http://localhost:8080/echo/user/header/fileName
     * @param fileName
     * @param response
     */
    @GetMapping("/header/{fileName}")
    public void getHeader(@PathVariable("fileName") String fileName, HttpServletResponse response) {
        // Path of the image on the server
        fileName = uploadPath + "/" + fileName;
        // File extension, without the dot (otherwise the content type would be e.g. "image/.png")
        String suffix = fileName.substring(fileName.lastIndexOf(".") + 1);
        // Respond with the image
        response.setContentType("image/" + suffix);
        try (
                FileInputStream fis = new FileInputStream(fileName);
                OutputStream os = response.getOutputStream();
        ) {
            byte[] buffer = new byte[1024];
            int b = 0;
            while ((b = fis.read(buffer)) != -1) {
                os.write(buffer, 0, b);
            }
        } catch (IOException e) {
            logger.error("读取文件失败" + e.getMessage());
        }
    }
}
```

In this code:

```java
try (
        FileInputStream fis = new FileInputStream(fileName);
        OutputStream os = response.getOutputStream();
) {

}
```

The benefit of this syntax (try-with-resources) is that the streams declared inside the parentheses are closed automatically, so we don't have to write the close calls by hand in a finally block.
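A minimal self-contained illustration of try-with-resources, using in-memory streams instead of the controller's file and response streams:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class TryWithResourcesDemo {

    // Copy an input stream to an output stream. Both implement AutoCloseable,
    // so try-with-resources closes them automatically, even when an exception is thrown.
    static byte[] copy(byte[] data) throws IOException {
        try (
                InputStream in = new ByteArrayInputStream(data);
                ByteArrayOutputStream out = new ByteArrayOutputStream();
        ) {
            byte[] buffer = new byte[1024];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            return out.toByteArray();
        } // no finally { in.close(); out.close(); } needed
    }

    public static void main(String[] args) throws IOException {
        System.out.println(new String(copy("hello".getBytes()))); // hello
    }
}
```

The resources are closed in reverse declaration order, and any exception thrown while closing is suppressed in favor of the one thrown in the body.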

### Front End

```html
<form method="post" enctype="multipart/form-data" th:action="@{/user/upload}">

    <input type="file" th:class="|custom-file-input ${error != null ? 'is-invalid' : ''}|"
           name="headerImage">
    <div class="invalid-feedback" th:text="${error}"></div>
```

```xml
<!-- Update the user's avatar -->
<update id="updateHeader">
    update user set header_url = #{headerUrl} where id = #{id}
</update>
```

## Change Password

### DAO

```java
/**
 * Update password
 * @param id
 * @param password the new password
 * @return
 */
int updatePassword(int id, String password);
```

```xml
<!-- Update password -->
<update id="updatePassword">
    update user set password = #{password} where id = #{id}
</update>
```

### Service

```java
/**
 * Update the user's password (salt and hash the new password before storing it)
 * @param userId
 * @param newPassword the new password
 * @return
 */
public int updatePassword(int userId, String newPassword) {
    User user = userMapper.selectById(userId);
    newPassword = CommunityUtil.md5(newPassword + user.getSalt()); // re-salt and hash
    return userMapper.updatePassword(userId, newPassword);
}
```

### Controller

```java
/**
 * Update the user's password
 * @param oldPassword the current password
 * @param newPassword the new password
 * @param model
 * @return
 */
@PostMapping("/password")
public String updatePassword(String oldPassword, String newPassword, Model model) {
    // Verify the current password
    User user = hostHolder.getUser();
    String md5OldPassword = CommunityUtil.md5(oldPassword + user.getSalt());
    if (!user.getPassword().equals(md5OldPassword)) {
        model.addAttribute("oldPasswordError", "原密码错误");
        return "/site/setting";
    }

    // Check that the new password differs from the old one
    String md5NewPassword = CommunityUtil.md5(newPassword + user.getSalt());
    if (user.getPassword().equals(md5NewPassword)) {
        model.addAttribute("newPasswordError", "新密码和原密码相同");
        return "/site/setting";
    }

    // Update the password
    userService.updatePassword(user.getId(), newPassword);

    return "redirect:/index";
}
```

### Front End

```html
<form method="post" th:action="@{/user/password}">
    <input th:class="|form-control ${oldPasswordError != null ? 'is-invalid' : ''}|"
           name="oldPassword" required>
    <div class="invalid-feedback" th:text="${oldPasswordError}"></div>
```

# Checking Login Status

---

Prevents a user who is not logged in from reaching pages that require login (such as account settings) by typing the URL directly.

Using an interceptor:

- Mark methods with a **custom annotation**

```java
/**
 * Used together with an interceptor to indicate that a method requires login
 */
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface LoginRequired {
}
```

Add `@LoginRequired` to the three methods for changing the password, changing the avatar, and opening the account settings page.

- Intercept all requests, but only act on methods that carry the annotation

```java
@Component
public class LoginRequiredInterceptor implements HandlerInterceptor {

    @Autowired
    private HostHolder hostHolder;

    /**
     * Called before the Controller executes.
     * Checks whether the target method carries the LoginRequired annotation;
     * if it does and the user is not logged in, reject the request and redirect to the login page.
     * @param request
     * @param response
     * @param handler
     * @return
     * @throws Exception
     */
    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
        if (handler instanceof HandlerMethod) {
            HandlerMethod handlerMethod = (HandlerMethod) handler;
            Method method = handlerMethod.getMethod();
            LoginRequired loginRequired = method.getAnnotation(LoginRequired.class);
            if (loginRequired != null && hostHolder.getUser() == null) {
                response.sendRedirect(request.getContextPath() + "/login");
                return false; // reject the request
            }
        }
        return true;
    }
}
```

Configuration:

```java
// Intercept all paths except static resources
registry.addInterceptor(loginRequiredInterceptor)
        .excludePathPatterns("/css/**", "/js/**", "/img/**");
```
# Filtering Sensitive Words

---

Uses a prefix tree:

- Also known as: Trie, digital tree, dictionary tree
- Characteristics: fast lookup, high memory consumption
- The root node of the trie is empty; every other node holds exactly one character
- A path from the root to a leaf node spells one string
- The children of a node all hold distinct characters (identical prefixes are merged)
- Applications: string lookup, word frequency counting, string sorting

![](https://gitee.com/veal98/images/raw/master/img/20210202144604.png)

![](https://gitee.com/veal98/images/raw/master/img/20210202144820.png)

The sensitive-word filter:

- Define the trie
- Initialize the trie from the sensitive-word list
- Write the filtering method

### Define the Trie

```java
/**
 * Trie node definition
 */
private class TrieNode {
    // Marks the end of a keyword (leaf node)
    private boolean isKeywordEnd = false;
    // Child nodes (key: the child's character, value: the child node)
    private Map<Character, TrieNode> subNodes = new HashMap<>();

    public boolean isKeywordEnd() {
        return isKeywordEnd;
    }

    public void setKeywordEnd(boolean keywordEnd) {
        isKeywordEnd = keywordEnd;
    }

    // Add a child node
    public void addSubNode(Character c, TrieNode node) {
        subNodes.put(c, node);
    }

    // Get a child node
    public TrieNode getSubNode(Character c) {
        return subNodes.get(c);
    }
}
```

### Initialize the Trie from the Sensitive-Word List

A method annotated with `@PostConstruct` runs automatically once the container has instantiated this class (which happens at startup) and its constructor has finished, i.e. exactly once, right after construction.

```java
@Component
public class SensitiveFilter {

    private static final Logger logger = LoggerFactory.getLogger(SensitiveFilter.class);

    // Sensitive words are replaced with ***
    private static final String REPLACEMENT = "***";

    // Root node
    private TrieNode rootNode = new TrieNode();

    /**
     * Initialize the trie
     */
    @PostConstruct // initialization method
    public void init() {
        try (
                InputStream is = this.getClass().getClassLoader().getResourceAsStream("sensitive-words.txt");
                BufferedReader reader = new BufferedReader(new InputStreamReader(is));
        ) {
            String keyword;
            while ((keyword = reader.readLine()) != null) {
                // Add it to the trie
                this.addKeyword(keyword);
            }
        } catch (IOException e) {
            logger.error("加载敏感词文件失败" + e.getMessage());
        }
    }

    /**
     * Add one sensitive word to the trie
     * @param keyword
     */
    private void addKeyword(String keyword) {
        TrieNode tempNode = rootNode;
        for (int i = 0; i < keyword.length(); i++) {
            char c = keyword.charAt(i);
            TrieNode subNode = tempNode.getSubNode(c); // check whether a matching child already exists

            if (subNode == null) {
                subNode = new TrieNode();        // create the child node
                tempNode.addSubNode(c, subNode); // attach it
            }

            // Move to the child, then continue with the next character
            tempNode = subNode;

            // Mark the end of the keyword: this character is the word's last one
            if (i == keyword.length() - 1) {
                tempNode.setKeywordEnd(true);
            }
        }
    }

}
```

### The Filtering Method

```java
/**
 * Filter sensitive words
 * @param text the text to filter
 * @return the filtered text (sensitive words replaced with ***)
 */
public String filter(String text) {
    if (StringUtils.isBlank(text)) {
        return null;
    }

    // Pointer 1: the working pointer into the trie
    TrieNode tempNode = rootNode;
    // Pointer 2: points at the first character of a candidate sensitive word in the text
    int begin = 0;
    // Pointer 3: points at the last character of a candidate sensitive word in the text
    int end = 0;

    // Accumulates the filtered text (the result)
    StringBuilder sb = new StringBuilder();

    while (end < text.length()) {
        char c = text.charAt(end);
        // Skip symbols (so that words padded with symbols, e.g. ☆赌☆博, are still caught)
        if (isSymbol(c)) {
            // If pointer 1 is at the root, copy the symbol straight to the result and advance pointer 2
            if (tempNode == rootNode) {
                sb.append(c);
                begin++;
            }
            // Whether the symbol is at the start or in the middle, pointer 3 advances
            end++;
            continue;
        }

        // Check the child node
        tempNode = tempNode.getSubNode(c);
        if (tempNode == null) {
            // The string starting at begin is not a sensitive word
            sb.append(text.charAt(begin));
            // Move on to the next position
            begin++;
            end = begin;
            // Reset pointer 1 to the root
            tempNode = rootNode;
        } else if (tempNode.isKeywordEnd()) {
            // Found a sensitive word: replace the substring begin~end
            sb.append(REPLACEMENT);
            // Move on to the next position
            end++;
            begin = end;
            // Reset pointer 1 to the root
            tempNode = rootNode;
        } else {
            // Check the next character
            end++;
        }
    }

    // Append the trailing characters (if the final candidate turned out not to be
    // a sensitive word, the loop above has not yet copied it into the result)
    sb.append(text.substring(begin));

    return sb.toString();
}

// Whether a character counts as a symbol
private boolean isSymbol(Character c) {
    // 0x2E80~0x9FFF is the East Asian character range
    return !CharUtils.isAsciiAlphanumeric(c) && (c < 0x2E80 || c > 0x9FFF);
}
```
docs/Guide.md:

# Echo — An Open-Source Community System

---

## 📚 What You Can Learn from This Project

- Mainstream Java Web development technologies and frameworks (Spring, Spring Boot, Spring MVC, MyBatis, MySQL, Redis, Kafka, Elasticsearch, and more)
- The complete workflow of a real Web project from development to deployment (the project comes with plenty of diagrams and detailed tutorials to help you get started quickly)
- The core technical points involved in this project, plus common interview questions and answers

## 🏄 Live Demo and Documentation

- Live demo: the project is deployed on a Tencent Cloud server and can be tried directly online: [http://1.15.127.74/](http://1.15.127.74/)
- Documentation: generated with Vuepress + Gitee Pages, available online at:

## 💻 Core Tech Stack

Back end:

- Spring
- Spring Boot 2.1.5 RELEASE
- Spring MVC
- ORM: MyBatis
- Database: MySQL 5.7
- Distributed cache: Redis
- Local cache: Caffeine
- Message queue: Kafka 2.13-2.7.0
- Search engine: Elasticsearch 6.4.3
- Security: Spring Security
- Mail: Spring Mail
- Distributed scheduled tasks: Spring Quartz
- Logging: SLF4J (logging facade) + Logback (implementation)

Front end:

- Thymeleaf
- Bootstrap 4.x
- Jquery
- Ajax

## 🔨 Development Environment

- OS: Windows 10
- Build tool: Apache Maven
- IDE: Intellij IDEA
- Application server: Apache Tomcat
- API testing: Postman
- Load testing: Apache JMeter
- Version control: Git
- Java version: 8

## 🎀 Screenshots

Home page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203303.png)

Login page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203320.png)

Post detail page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203338.png)

Profile page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203351.png)

Private message list:

![](https://gitee.com/veal98/images/raw/master/img/20210217203418.png)

Message detail page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203505.png)

System notifications page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203440.png)

Notification detail page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203455.png)

Account settings page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203529.png)

Site statistics page:

![](https://gitee.com/veal98/images/raw/master/img/20210217203545.png)

Search results page:

![](https://gitee.com/veal98/images/raw/master/img/20210217204109.png)

## 🎨 Feature List

![](https://gitee.com/veal98/images/raw/master/img/20210209182546.png)

- [x] **Registration**

  - On successful registration the user's info is stored in MySQL, with the account in an inactive state
  - An activation email is sent to the user; clicking the link activates the account (Spring Mail)

- [x] **Login | Logout**

  - On entering the login page, a captcha is generated dynamically and stored briefly in Redis (60 seconds)

  - On successful login (username, password, and captcha all verified), a login ticket is generated, marked valid, and stored in Redis

    Note: the ticket has an expiry time. Before every request is handled, the ticket is checked for validity and expiry; as long as the user's ticket is valid and unexpired, the request holds the user's info for its duration (via ThreadLocal)

  - Ticking "remember me" extends the ticket's lifetime

  - On successful login, the user's info is also cached briefly in Redis (1 hour)

  - On logout, the ticket is marked invalid and its entry in Redis is updated

- [x] **Account settings**

  - Change avatar
    - The chosen avatar image is uploaded to the Qiniu Cloud server
  - Change password

- [x] **Posts**

  - Publish a post (sensitive words filtered) and store it in MySQL
  - Display all posts with pagination
    - Sort by post time
    - Sort by hotness (Spring Quartz)
  - View post details
  - Permission management (Spring Security + Thymeleaf Security)
    - Users who are not logged in cannot post
    - Moderators see and can use the pin and highlight buttons
    - Administrators see and can use the delete button
    - Ordinary users see none of the pin / highlight / delete buttons and cannot perform those actions

- [x] **Comments**

  - Comment on a post (sensitive words filtered) and store it in MySQL
  - Display comments with pagination
  - Reply to a comment (sensitive words filtered)
  - Permission management (Spring Security)
    - Users who are not logged in cannot comment

- [x] **Private messages**

  - Send a private message (sensitive words filtered)
  - Message list
    - Query the current user's conversation list
    - Each conversation shows only its latest message
    - Pagination supported
  - Message detail
    - Query all messages in a conversation
    - Opening a conversation marks the displayed messages as read
    - Pagination supported
  - Permission management (Spring Security)
    - Users who are not logged in cannot use private messaging

- [x] **Unified handling of 404 / 500 errors**

  - Regular requests
  - Asynchronous requests

- [x] **Unified logging**

- [x] **Likes**

  - Like posts and comments/replies
    - First click likes, second click unlikes
  - Show each post's like count on the home page
  - Show like counts for the post and its comments/replies on the detail page
  - Show the current user's like status on the detail page (already liked shows as "liked")

  - Count the likes a user has received
  - Permission management (Spring Security)
    - Users who are not logged in cannot use the like features

- [x] **Follows**

  - Follow
  - Unfollow
  - Count a user's followees and followers
  - My followees list (the users a given user follows), with pagination
  - My followers list, with pagination
  - Permission management (Spring Security)
    - Users who are not logged in cannot use the follow features

- [x] **System notifications**

  - Notification list
    - Shows three types of notifications: comments, likes, and follows
  - Notification detail
    - Shows the notifications of one topic with pagination
    - Opening a notification type marks all unread notifications on that page as read
  - Unread counts
    - Unread count per notification type
    - Total unread notification count
    - The navbar shows the total unread message count (unread private messages + unread notifications)
  - Permission management (Spring Security)
    - Users who are not logged in cannot use system notifications

- [x] **Search**

  - Event publishing
    - When a post is published, it is submitted asynchronously to the Elasticsearch server via the message queue
    - When a post receives a comment, it is re-submitted asynchronously to the Elasticsearch server via the message queue
  - Search service
    - Search posts on the Elasticsearch server
    - Delete posts from the Elasticsearch server (when a post is removed from the database)
  - Display search results

- [x] **Site statistics** (administrators only)

  - Unique visitors (UV)
    - Stored in a Redis HyperLogLog
    - Single-day and date-range queries supported
  - Daily active users (DAU)
    - Stored in a Redis Bitmap
    - Single-day and date-range queries supported
  - Permission management (Spring Security)
    - Only administrators can view site statistics

- [x] Performance optimization

  - Use the local cache Caffeine for the hot-post list and the total post count

## 🔐 To Do and Known Issues

These are problems I have found in the project but have no good solution for yet; PRs are very welcome:

- [ ] The registration module fails to redirect to the operation-result page (works fine when run locally)
- [ ] The front-end display of the comment feature has a bug
- [ ] "My comments" query (incomplete)

And here are features I think could still be added; feel free to open an issue to suggest more, or submit a PR implementing one:

- [ ] Forgot password (reset via email)
- [ ] "My likes" query
- [ ] Let administrators un-pin a post with a second click
- [ ] Let administrators restore deleted posts (deleting a post in this project does not remove it from the database; it only sets its status to blocked)

## 🌱 Running Locally

If you want to run and test the project locally, have the following environment ready first:

- Java 8
- MySQL 5.7
- Redis
- Kafka 2.13-2.7.0
- Elasticsearch 6.4.3

Then **change the configuration to match your own local environment — it will not run as-is**, and all sensitive values have been replaced with xxxxxxx.

Configuration to change for a local run:

1) `application-develop.properties`:

- MySQL
- Spring Mail (the mailbox must have SMTP enabled)
- Kafka: consumer.group-id (this field is in consumer.properties inside the Kafka distribution; it can be changed, but restart Kafka afterwards)
- Elasticsearch: cluster-name (this field is in elasticsearch.yml inside the Elasticsearch distribution; it can be changed)
- Qiniu Cloud (you need a Qiniu object-storage bucket for the uploaded avatar images)

2) `logback-spring-develop.xml`:

- LOG_PATH: where the logs are written

Services to start before each run:

- MySQL
- Redis
- Elasticsearch
- Kafka

You also need to create the database tables in advance; see below.

## 📜 Database Design

User `user`:

```sql
DROP TABLE IF EXISTS `user`;
SET character_set_client = utf8mb4 ;
CREATE TABLE `user` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `username` varchar(50) DEFAULT NULL,
  `password` varchar(50) DEFAULT NULL,
  `salt` varchar(50) DEFAULT NULL,
  `email` varchar(100) DEFAULT NULL,
  `type` int(11) DEFAULT NULL COMMENT '0-普通用户; 1-超级管理员; 2-版主;',
  `status` int(11) DEFAULT NULL COMMENT '0-未激活; 1-已激活;',
  `activation_code` varchar(100) DEFAULT NULL,
  `header_url` varchar(200) DEFAULT NULL,
  `create_time` timestamp NULL DEFAULT NULL,
  PRIMARY KEY (`id`),
  KEY `index_username` (`username`(20)),
  KEY `index_email` (`email`(20))
) ENGINE=InnoDB AUTO_INCREMENT=101 DEFAULT CHARSET=utf8;
```

Posts `discuss_post`:

```sql
DROP TABLE IF EXISTS `discuss_post`;
SET character_set_client = utf8mb4 ;
CREATE TABLE `discuss_post` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `user_id` int(11) DEFAULT NULL,
  `title` varchar(100) DEFAULT NULL,
  `content` text,
  `type` int(11) DEFAULT NULL COMMENT '0-普通; 1-置顶;',
  `status` int(11) DEFAULT NULL COMMENT '0-正常; 1-精华; 2-拉黑;',
  `create_time` timestamp NULL DEFAULT NULL,
  `comment_count` int(11) DEFAULT NULL,
  `score` double DEFAULT NULL,
  PRIMARY KEY (`id`),
  KEY `index_user_id` (`user_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```

Comments (replies) `comment`:

```sql
CREATE TABLE `comment` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `user_id` int(11) DEFAULT NULL,
  `entity_type` int(11) DEFAULT NULL COMMENT '评论目标的类别:1 帖子;2 评论 ',
  `entity_id` int(11) DEFAULT NULL COMMENT '评论目标的 id',
  `target_id` int(11) DEFAULT NULL COMMENT '指明对谁进行评论',
  `content` text,
  `status` int(11) DEFAULT NULL COMMENT '状态:0 正常;1 禁用',
  `create_time` timestamp NULL DEFAULT NULL,
  PRIMARY KEY (`id`),
  KEY `index_user_id` (`user_id`),
  KEY `index_entity_id` (`entity_id`)
) ENGINE=InnoDB AUTO_INCREMENT=247 DEFAULT CHARSET=utf8;
```

Private messages `message`:

```sql
DROP TABLE IF EXISTS `message`;
SET character_set_client = utf8mb4 ;
CREATE TABLE `message` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `from_id` int(11) DEFAULT NULL,
  `to_id` int(11) DEFAULT NULL,
  `conversation_id` varchar(45) NOT NULL,
  `content` text,
  `status` int(11) DEFAULT NULL COMMENT '0-未读;1-已读;2-删除;',
  `create_time` timestamp NULL DEFAULT NULL,
  PRIMARY KEY (`id`),
  KEY `index_from_id` (`from_id`),
  KEY `index_to_id` (`to_id`),
  KEY `index_conversation_id` (`conversation_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```

## 🌌 Ideal Deployment Architecture

I deployed only one instance of each component; the ideal deployment architecture looks like this:

![](https://gitee.com/veal98/images/raw/master/img/20210211172016.png)

## 🎯 Feature Logic Diagrams

Some (not entirely rigorous) diagrams to help sort out the flow.

> One-way green arrow:
>
> - Template -> Controller: the template contains a link handled by this Controller
> - Controller -> template: the Controller passes data to, or forwards to, this template
>
> Two-way green arrow: the Controller and the template exchange or share parameters
>
> One-way blue arrow: A -> B means method A calls method B
>
> One-way red arrow: a database or cache operation

### Registration

- On successful registration the user's info is stored in MySQL, with the account in an inactive state
- An activation email is sent to the user; clicking the link activates the account (Spring Mail)

![](https://gitee.com/veal98/images/raw/master/img/20210203223957.png)

### Login | Logout

- On entering the login page, a captcha is generated dynamically and stored briefly in Redis (60 seconds)

- On successful login (username, password, and captcha all verified), a login ticket is generated, marked valid, and stored in Redis

  Note: the ticket has an expiry time. Before every request is handled, the ticket is checked for validity and expiry; as long as the user's ticket is valid and unexpired, the request holds the user's info for its duration (via ThreadLocal)

- Ticking "remember me" extends the ticket's lifetime

- On successful login, the user's info is also cached briefly in Redis (1 hour)

- On logout, the ticket is marked invalid and its entry in Redis is updated

The diagram below shows the login module's logic; it does not use Spring Security's authentication flow (I find this the most complex module, and the diagram still leaves out many details):

![](https://gitee.com/veal98/images/raw/master/img/20210211222251.png)

### Displaying All Posts with Pagination

- Sort by post time
- Sort by hotness (Spring Quartz)
- The hot-post list and the total post count are stored in the local cache Caffeine (the distributed scheduler Spring Quartz periodically recomputes each post's hotness/score — see below — while Caffeine keeps its own data fresh automatically; all it needs is a loading method)

![](https://gitee.com/veal98/images/raw/master/img/20210208195831.png)

### Account Settings

- Change avatar (asynchronous request)
  - The chosen avatar image is uploaded to the Qiniu Cloud server
- Change password

Only the avatar flow is drawn here:

![](https://gitee.com/veal98/images/raw/master/img/20210209120351.png)

### Publishing a Post (Asynchronous Request)

![](https://gitee.com/veal98/images/raw/master/img/20210209154935.png)

### Displaying Comments and Related Info

> The front-end name display for comments has some flaws; PRs welcome~

The key to the comment module is the design of the comment table; only by understanding its fields does the logic become clear.

A comment's target type (post or comment) entityType and entityId, and targetId (which user the comment/reply addresses), are passed from the front end to DiscussPostController.

![](https://gitee.com/veal98/images/raw/master/img/20210209170603.png)

The information a post detail page needs to assemble looks roughly like this:

![](https://gitee.com/veal98/images/raw/master/img/20210209164740.png)

### Adding a Comment (Transaction Management)

![](https://gitee.com/veal98/images/raw/master/img/20210209183326.png)

### Message List and Detail

![](https://gitee.com/veal98/images/raw/master/img/20210209212307.png)

### Sending a Private Message (Asynchronous Request)

![](https://gitee.com/veal98/images/raw/master/img/20210209213713.png)

### Likes (Asynchronous Request)

Like data is stored in a Redis set. The key is named `like:entity:entityType:entityId` and the values are the ids of the users who liked the entity. For example key = `like:entity:2:246`, value = `11` means user 11 liked entity type 2 (a comment) whose id is 246.

A user's received-like count is stored in Redis under the key `like:user:userId`; the value is the count itself.

![](https://gitee.com/veal98/images/raw/master/img/20210210112525.png)

### My Received Likes

![](https://gitee.com/veal98/images/raw/master/img/20210210103648.png)

### Follow (Asynchronous Request)

- If A follows B, A is B's follower and B is A's followee
- The target of a follow can be a user, a post, a question, etc.; targets are abstracted as entities (currently only following users is implemented)

The entities a user follows are stored in a Redis zset: the key is `followee:userId:entityType` and the value is `zset(entityId, now)`, ordered by follow time. For example `followee:111:3` with value `(20, 2020-02-03-xxxx)` means user 111 followed entity type 3, i.e. a user, whose id is 20, at time 2020-02-03-xxxx.

Likewise, an entity's followers are stored in a Redis zset: the key is `follower:entityType:entityId` and the value is `zset(userId, now)`, ordered by follow time.
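A key builder makes these naming schemes concrete. The project presumably has a helper along these lines (its source is not shown here, so the class and method names below are assumptions):

```java
// Hypothetical sketch of a builder for the Redis key schemes described above.
public class RedisKeyUtilSketch {

    private static final String SPLIT = ":";

    // like:entity:entityType:entityId -> set of userIds who liked the entity
    public static String getEntityLikeKey(int entityType, int entityId) {
        return "like" + SPLIT + "entity" + SPLIT + entityType + SPLIT + entityId;
    }

    // like:user:userId -> number of likes the user has received
    public static String getUserLikeKey(int userId) {
        return "like" + SPLIT + "user" + SPLIT + userId;
    }

    // followee:userId:entityType -> zset(entityId, followTime)
    public static String getFolloweeKey(int userId, int entityType) {
        return "followee" + SPLIT + userId + SPLIT + entityType;
    }

    // follower:entityType:entityId -> zset(userId, followTime)
    public static String getFollowerKey(int entityType, int entityId) {
        return "follower" + SPLIT + entityType + SPLIT + entityId;
    }

    public static void main(String[] args) {
        System.out.println(getEntityLikeKey(2, 246)); // like:entity:2:246
        System.out.println(getFolloweeKey(111, 3));   // followee:111:3
    }
}
```

Centralizing key construction in one class keeps the naming scheme consistent and makes it easy to change a prefix without hunting through the codebase.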

![](https://gitee.com/veal98/images/raw/master/img/20210210123114.png)

### Followee List

![](https://gitee.com/veal98/images/raw/master/img/20210210145837.png)

### Sending System Notifications

![](https://gitee.com/veal98/images/raw/master/img/20210211102801.png)

### Displaying System Notifications

![](https://gitee.com/veal98/images/raw/master/img/20210211113516.png)

### Search

![](https://gitee.com/veal98/images/raw/master/img/20210211141528.png)

Similarly, pinning and highlighting also trigger the post event; they are not drawn in the diagram.

### Pin, Highlight, Delete (Asynchronous Request)

![](https://gitee.com/veal98/images/raw/master/img/20210211145926.png)

### Site Statistics

![](https://gitee.com/veal98/images/raw/master/img/20210211150751.png)

### Post Hotness Calculation

Whenever a post is liked, commented on, or highlighted, it is recorded in the Redis cache; the distributed scheduler Spring Quartz then periodically takes these posts out of the cache and recomputes their scores.

Score formula: score (hotness) = log10(weight) + number of days between the post time and the epoch date

```java
// Compute the weight
double w = (wonderful ? 75 : 0) + commentCount * 10 + likeCount * 2;
// Score = log10(weight) + days since the epoch date
double score = Math.log10(Math.max(w, 1))
        + (post.getCreateTime().getTime() - epoch.getTime()) / (1000 * 3600 * 24);
```
|
||||
|
||||

|
||||
|
||||
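把公式代入具体数字,可以更直观地看到「对数权重 + 天数」的效果。下面是一个可运行的示意版本(方法签名与 epoch 取值为笔者假设,实际项目中分数计算写在定时任务里):

```java
// 示意:帖子分数计算的可运行版本(epoch 取值为假设)
public class PostScoreDemo {

    static double score(boolean wonderful, int commentCount, int likeCount,
                        long createTimeMillis, long epochMillis) {
        // 权重:加精 75 分,每条评论 10 分,每个赞 2 分
        double w = (wonderful ? 75 : 0) + commentCount * 10 + likeCount * 2;
        // 分数 = log10(权重) + 发帖距离纪元的天数(长整型除法,按整天截断)
        return Math.log10(Math.max(w, 1))
                + (createTimeMillis - epochMillis) / (1000L * 3600 * 24);
    }

    public static void main(String[] args) {
        long epoch = 0L;                // 示意用的纪元时间
        long oneDay = 24L * 3600 * 1000;
        // 发帖 10 天后:加精、10 条评论、20 个赞 -> log10(75 + 100 + 40) + 10
        System.out.println(score(true, 10, 20, 10 * oneDay, epoch));
    }
}
```

对权重取对数意味着互动量的边际收益递减,而「距离天数」一项保证新帖天然占优,热度会随时间自然衰减。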
## 📖 配套教程

想要自己从零开始实现这个项目或者深入理解的小伙伴,可以扫描下方二维码关注公众号『**飞天小牛肉**』,第一时间获取配套教程。教程不仅会详细解释本项目涉及的各大技术点,还会汇总相关的常见面试题,目前尚在更新中。

<img src="https://gitee.com/veal98/images/raw/master/img/20210204145531.png" style="zoom:67%;" />

## 📞 联系我

有什么问题也可以添加我的微信,记得备注来意,格式:<u>(学校或公司 - 姓名或昵称 - 来意)</u>

<img width="260px" src="https://gitee.com/veal98/images/raw/master/img/微信图片_20210105121328.jpg" >

## 👏 鸣谢

本项目参考 [牛客网](https://www.nowcoder.com/) Java 高级工程师课程,感谢老师和平台。
@ -1,19 +0,0 @@

## IDEA 快捷键

- Ctrl + P:显示方法形参

![](https://gitee.com/veal98/images/raw/master/img/20210131203802.png)

- Ctrl + Shift + N:查找文件
- Ctrl + N:查找类
- Alt + Enter:快速处理异常
- Shift + Enter:在当前行的下方开始新行
- Ctrl + E:切换打开的标签页
- Ctrl + Alt + V:抽离变量
- Alt + 1:打开/关闭侧边栏
16
docs/README.md
Normal file
@ -0,0 +1,16 @@
---
home: true
heroImage: https://gitee.com/veal98/images/raw/master/img/20210211175136.png
heroText: Great Community — Echo
tagline: 🦄 开源社区系统
actionText: 快速上手 →
actionLink: /Guide
features:
- title: 主流技术栈
  details: 基于目前国内主流技术栈开发:SpringBoot + MyBatis + MySQL + Redis + Kafka + Elasticsearch + ...
- title: 详细开发文档
  details: 包含详细文档以及大量图例,帮助读者快速掌握本项目
- title: 配套友好教程
  details: 关注公众号『飞天小牛肉』第一时间获取教程更新,目前尚在连载中
footer: MIT Licensed | Copyright © 2021-present 小牛肉
---
|
||||
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
|
||||
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
|
||||
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)
|
||||
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
|
||||
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
|
||||
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
|
||||
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)
|
||||
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
|
||||
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
|
||||
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
|
||||
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:200)
|
||||
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
|
||||
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:490)
|
||||
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:139)
|
||||
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)
|
||||
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
|
||||
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)
|
||||
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:408)
|
||||
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
|
||||
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:836)
|
||||
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1747)
|
||||
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
|
||||
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
|
||||
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
|
||||
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
|
||||
at java.lang.Thread.run(Thread.java:748)
|
||||
2021-02-07 15:00:52,271 ERROR [http-nio-8080-exec-10] o.t.TemplateEngine [TemplateEngine.java:1136] [THYMELEAF][http-nio-8080-exec-10] Exception processing template "/site/discuss-detail": An error happened during template parsing (template: "class path resource [templates//site/discuss-detail.html]")
org.thymeleaf.exceptions.TemplateInputException: An error happened during template parsing (template: "class path resource [templates//site/discuss-detail.html]")
    at org.thymeleaf.templateparser.markup.AbstractMarkupTemplateParser.parse(AbstractMarkupTemplateParser.java:241)
    at org.thymeleaf.templateparser.markup.AbstractMarkupTemplateParser.parseStandalone(AbstractMarkupTemplateParser.java:100)
    at org.thymeleaf.engine.TemplateManager.parseAndProcess(TemplateManager.java:666)
    at org.thymeleaf.TemplateEngine.process(TemplateEngine.java:1098)
    at org.thymeleaf.TemplateEngine.process(TemplateEngine.java:1072)
    at org.thymeleaf.spring5.view.ThymeleafView.renderFragment(ThymeleafView.java:362)
    at org.thymeleaf.spring5.view.ThymeleafView.render(ThymeleafView.java:189)
    at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1371)
    at org.springframework.web.servlet.DispatcherServlet.processDispatchResult(DispatcherServlet.java:1117)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1056)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:634)
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:741)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:320)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:119)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:74)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:215)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:178)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:357)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:270)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:200)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:490)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:139)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)
    at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:408)
    at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
    at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:836)
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1747)
    at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.attoparser.ParseException: Exception evaluating SpringEL expression: "rvo.target.id" (template: "/site/discuss-detail" - line 107, col 67)
    at org.attoparser.MarkupParser.parseDocument(MarkupParser.java:393)
    at org.attoparser.MarkupParser.parse(MarkupParser.java:257)
    at org.thymeleaf.templateparser.markup.AbstractMarkupTemplateParser.parse(AbstractMarkupTemplateParser.java:230)
    ... 82 common frames omitted
Caused by: org.thymeleaf.exceptions.TemplateProcessingException: Exception evaluating SpringEL expression: "rvo.target.id" (template: "/site/discuss-detail" - line 107, col 67)
    at org.thymeleaf.spring5.expression.SPELVariableExpressionEvaluator.evaluate(SPELVariableExpressionEvaluator.java:290)
    at org.thymeleaf.standard.expression.VariableExpression.executeVariableExpression(VariableExpression.java:166)
    at org.thymeleaf.standard.expression.SimpleExpression.executeSimple(SimpleExpression.java:66)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:109)
    at org.thymeleaf.standard.expression.AdditionExpression.executeAddition(AdditionExpression.java:96)
    at org.thymeleaf.standard.expression.ComplexExpression.executeComplex(ComplexExpression.java:62)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:112)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:138)
    at org.thymeleaf.standard.expression.LinkExpression.executeLinkExpression(LinkExpression.java:264)
    at org.thymeleaf.standard.expression.SimpleExpression.executeSimple(SimpleExpression.java:85)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:109)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:138)
    at org.thymeleaf.standard.processor.AbstractStandardExpressionAttributeTagProcessor.doProcess(AbstractStandardExpressionAttributeTagProcessor.java:144)
    at org.thymeleaf.processor.element.AbstractAttributeTagProcessor.doProcess(AbstractAttributeTagProcessor.java:74)
    at org.thymeleaf.processor.element.AbstractElementTagProcessor.process(AbstractElementTagProcessor.java:95)
    at org.thymeleaf.util.ProcessorConfigurationUtils$ElementTagProcessorWrapper.process(ProcessorConfigurationUtils.java:633)
    at org.thymeleaf.engine.ProcessorTemplateHandler.handleOpenElement(ProcessorTemplateHandler.java:1314)
    at org.thymeleaf.engine.OpenElementTag.beHandled(OpenElementTag.java:205)
    at org.thymeleaf.engine.Model.process(Model.java:282)
    at org.thymeleaf.engine.Model.process(Model.java:290)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.processIterationModel(IteratedGatheringModelProcessable.java:367)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.process(IteratedGatheringModelProcessable.java:221)
    at org.thymeleaf.engine.ProcessorTemplateHandler.handleCloseElement(ProcessorTemplateHandler.java:1640)
    at org.thymeleaf.engine.CloseElementTag.beHandled(CloseElementTag.java:139)
    at org.thymeleaf.engine.Model.process(Model.java:282)
    at org.thymeleaf.engine.Model.process(Model.java:290)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.processIterationModel(IteratedGatheringModelProcessable.java:367)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.process(IteratedGatheringModelProcessable.java:221)
    at org.thymeleaf.engine.ProcessorTemplateHandler.handleCloseElement(ProcessorTemplateHandler.java:1640)
    at org.thymeleaf.engine.TemplateHandlerAdapterMarkupHandler.handleCloseElementEnd(TemplateHandlerAdapterMarkupHandler.java:388)
    at org.thymeleaf.templateparser.markup.InlinedOutputExpressionMarkupHandler$InlineMarkupAdapterPreProcessorHandler.handleCloseElementEnd(InlinedOutputExpressionMarkupHandler.java:322)
    at org.thymeleaf.standard.inline.OutputExpressionInlinePreProcessorHandler.handleCloseElementEnd(OutputExpressionInlinePreProcessorHandler.java:220)
    at org.thymeleaf.templateparser.markup.InlinedOutputExpressionMarkupHandler.handleCloseElementEnd(InlinedOutputExpressionMarkupHandler.java:164)
    at org.attoparser.HtmlElement.handleCloseElementEnd(HtmlElement.java:169)
    at org.attoparser.HtmlMarkupHandler.handleCloseElementEnd(HtmlMarkupHandler.java:412)
    at org.attoparser.MarkupEventProcessorHandler.handleCloseElementEnd(MarkupEventProcessorHandler.java:473)
    at org.attoparser.ParsingElementMarkupUtil.parseCloseElement(ParsingElementMarkupUtil.java:201)
    at org.attoparser.MarkupParser.parseBuffer(MarkupParser.java:725)
    at org.attoparser.MarkupParser.parseDocument(MarkupParser.java:301)
    ... 84 common frames omitted
Caused by: org.springframework.expression.spel.SpelEvaluationException: EL1007E: Property or field 'id' cannot be found on null
    at org.springframework.expression.spel.ast.PropertyOrFieldReference.readProperty(PropertyOrFieldReference.java:213)
    at org.springframework.expression.spel.ast.PropertyOrFieldReference.getValueInternal(PropertyOrFieldReference.java:104)
    at org.springframework.expression.spel.ast.PropertyOrFieldReference.access$000(PropertyOrFieldReference.java:51)
    at org.springframework.expression.spel.ast.PropertyOrFieldReference$AccessorLValue.getValue(PropertyOrFieldReference.java:406)
    at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:90)
    at org.springframework.expression.spel.ast.SpelNodeImpl.getValue(SpelNodeImpl.java:109)
    at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:328)
    at org.thymeleaf.spring5.expression.SPELVariableExpressionEvaluator.evaluate(SPELVariableExpressionEvaluator.java:263)
    ... 122 common frames omitted
2021-02-07 15:00:52,273 ERROR [http-nio-8080-exec-10] o.a.c.c.C.[.[.[.[dispatcherServlet] [DirectJDKLog.java:175] Servlet.service() for servlet [dispatcherServlet] in context with path [/echo] threw exception [Request processing failed; nested exception is org.thymeleaf.exceptions.TemplateInputException: An error happened during template parsing (template: "class path resource [templates//site/discuss-detail.html]")] with root cause
org.springframework.expression.spel.SpelEvaluationException: EL1007E: Property or field 'id' cannot be found on null
    at org.springframework.expression.spel.ast.PropertyOrFieldReference.readProperty(PropertyOrFieldReference.java:213)
    at org.springframework.expression.spel.ast.PropertyOrFieldReference.getValueInternal(PropertyOrFieldReference.java:104)
    at org.springframework.expression.spel.ast.PropertyOrFieldReference.access$000(PropertyOrFieldReference.java:51)
    at org.springframework.expression.spel.ast.PropertyOrFieldReference$AccessorLValue.getValue(PropertyOrFieldReference.java:406)
    at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:90)
    at org.springframework.expression.spel.ast.SpelNodeImpl.getValue(SpelNodeImpl.java:109)
    at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:328)
    at org.thymeleaf.spring5.expression.SPELVariableExpressionEvaluator.evaluate(SPELVariableExpressionEvaluator.java:263)
    at org.thymeleaf.standard.expression.VariableExpression.executeVariableExpression(VariableExpression.java:166)
    at org.thymeleaf.standard.expression.SimpleExpression.executeSimple(SimpleExpression.java:66)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:109)
    at org.thymeleaf.standard.expression.AdditionExpression.executeAddition(AdditionExpression.java:96)
    at org.thymeleaf.standard.expression.ComplexExpression.executeComplex(ComplexExpression.java:62)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:112)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:138)
    at org.thymeleaf.standard.expression.LinkExpression.executeLinkExpression(LinkExpression.java:264)
    at org.thymeleaf.standard.expression.SimpleExpression.executeSimple(SimpleExpression.java:85)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:109)
    at org.thymeleaf.standard.expression.Expression.execute(Expression.java:138)
    at org.thymeleaf.standard.processor.AbstractStandardExpressionAttributeTagProcessor.doProcess(AbstractStandardExpressionAttributeTagProcessor.java:144)
    at org.thymeleaf.processor.element.AbstractAttributeTagProcessor.doProcess(AbstractAttributeTagProcessor.java:74)
    at org.thymeleaf.processor.element.AbstractElementTagProcessor.process(AbstractElementTagProcessor.java:95)
    at org.thymeleaf.util.ProcessorConfigurationUtils$ElementTagProcessorWrapper.process(ProcessorConfigurationUtils.java:633)
    at org.thymeleaf.engine.ProcessorTemplateHandler.handleOpenElement(ProcessorTemplateHandler.java:1314)
    at org.thymeleaf.engine.OpenElementTag.beHandled(OpenElementTag.java:205)
    at org.thymeleaf.engine.Model.process(Model.java:282)
    at org.thymeleaf.engine.Model.process(Model.java:290)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.processIterationModel(IteratedGatheringModelProcessable.java:367)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.process(IteratedGatheringModelProcessable.java:221)
    at org.thymeleaf.engine.ProcessorTemplateHandler.handleCloseElement(ProcessorTemplateHandler.java:1640)
    at org.thymeleaf.engine.CloseElementTag.beHandled(CloseElementTag.java:139)
    at org.thymeleaf.engine.Model.process(Model.java:282)
    at org.thymeleaf.engine.Model.process(Model.java:290)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.processIterationModel(IteratedGatheringModelProcessable.java:367)
    at org.thymeleaf.engine.IteratedGatheringModelProcessable.process(IteratedGatheringModelProcessable.java:221)
    at org.thymeleaf.engine.ProcessorTemplateHandler.handleCloseElement(ProcessorTemplateHandler.java:1640)
    at org.thymeleaf.engine.TemplateHandlerAdapterMarkupHandler.handleCloseElementEnd(TemplateHandlerAdapterMarkupHandler.java:388)
    at org.thymeleaf.templateparser.markup.InlinedOutputExpressionMarkupHandler$InlineMarkupAdapterPreProcessorHandler.handleCloseElementEnd(InlinedOutputExpressionMarkupHandler.java:322)
    at org.thymeleaf.standard.inline.OutputExpressionInlinePreProcessorHandler.handleCloseElementEnd(OutputExpressionInlinePreProcessorHandler.java:220)
    at org.thymeleaf.templateparser.markup.InlinedOutputExpressionMarkupHandler.handleCloseElementEnd(InlinedOutputExpressionMarkupHandler.java:164)
    at org.attoparser.HtmlElement.handleCloseElementEnd(HtmlElement.java:169)
    at org.attoparser.HtmlMarkupHandler.handleCloseElementEnd(HtmlMarkupHandler.java:412)
    at org.attoparser.MarkupEventProcessorHandler.handleCloseElementEnd(MarkupEventProcessorHandler.java:473)
    at org.attoparser.ParsingElementMarkupUtil.parseCloseElement(ParsingElementMarkupUtil.java:201)
    at org.attoparser.MarkupParser.parseBuffer(MarkupParser.java:725)
    at org.attoparser.MarkupParser.parseDocument(MarkupParser.java:301)
    at org.attoparser.MarkupParser.parse(MarkupParser.java:257)
    at org.thymeleaf.templateparser.markup.AbstractMarkupTemplateParser.parse(AbstractMarkupTemplateParser.java:230)
    at org.thymeleaf.templateparser.markup.AbstractMarkupTemplateParser.parseStandalone(AbstractMarkupTemplateParser.java:100)
    at org.thymeleaf.engine.TemplateManager.parseAndProcess(TemplateManager.java:666)
    at org.thymeleaf.TemplateEngine.process(TemplateEngine.java:1098)
    at org.thymeleaf.TemplateEngine.process(TemplateEngine.java:1072)
    at org.thymeleaf.spring5.view.ThymeleafView.renderFragment(ThymeleafView.java:362)
    at org.thymeleaf.spring5.view.ThymeleafView.render(ThymeleafView.java:189)
    at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1371)
    at org.springframework.web.servlet.DispatcherServlet.processDispatchResult(DispatcherServlet.java:1117)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1056)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:942)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1005)
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:897)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:634)
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:882)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:741)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:320)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:119)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:74)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:215)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:178)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:357)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:270)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:92)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:200)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:490)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:139)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:74)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)
    at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:408)
    at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
    at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:836)
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1747)
    at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:748)
12387  log/community/info/log-info-2021-01-31.0.log  Normal file
File diff suppressed because it is too large
26186  log/community/info/log-info-2021-02-02.0.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.1.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.2.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.3.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.4.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.5.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.6.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.7.log  Normal file
File diff suppressed because it is too large
25088  log/community/info/log-info-2021-02-02.8.log  Normal file
File diff suppressed because it is too large
10416  log/community/info/log-info-2021-02-02.9.log  Normal file
File diff suppressed because it is too large
251  log/community/info/log-info-2021-02-03.0.log  Normal file
@ -0,0 +1,251 @@
2021-02-03 18:20:01,226 INFO [main] c.g.c.CaffeineTest [StartupInfoLogger.java:50] Starting CaffeineTest on LAPTOP-5SJBI05C with PID 15776 (started by 19124 in E:\GreateCommunity)
2021-02-03 18:20:01,231 INFO [main] c.g.c.CaffeineTest [SpringApplication.java:675] No active profile set, falling back to default profiles: default
2021-02-03 18:20:01,971 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:244] Multiple Spring Data modules found, entering strict repository configuration mode!
2021-02-03 18:20:01,974 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:126] Bootstrapping Spring Data repositories in DEFAULT mode.
2021-02-03 18:20:02,162 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:182] Finished Spring Data repository scanning in 177ms. Found 1 repository interfaces.
2021-02-03 18:20:02,174 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:244] Multiple Spring Data modules found, entering strict repository configuration mode!
2021-02-03 18:20:02,176 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:126] Bootstrapping Spring Data repositories in DEFAULT mode.
2021-02-03 18:20:02,196 INFO [main] o.s.d.r.c.RepositoryConfigurationExtensionSupport [RepositoryConfigurationExtensionSupport.java:363] Spring Data Redis - Could not safely identify store assignment for repository candidate interface com.greate.community.dao.elasticsearch.DiscussPostRepository.
2021-02-03 18:20:02,196 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:182] Finished Spring Data repository scanning in 12ms. Found 0 repository interfaces.
2021-02-03 18:20:02,590 INFO [main] c.u.j.c.EnableEncryptablePropertiesBeanFactoryPostProcessor [EnableEncryptablePropertiesBeanFactoryPostProcessor.java:48] Post-processing PropertySource instances
2021-02-03 18:20:02,650 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource configurationProperties [org.springframework.boot.context.properties.source.ConfigurationPropertySourcesPropertySource] to AOP Proxy
2021-02-03 18:20:02,651 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource Inlined Test Properties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-03 18:20:02,651 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource systemProperties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-03 18:20:02,651 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource systemEnvironment [org.springframework.boot.env.SystemEnvironmentPropertySourceEnvironmentPostProcessor$OriginAwareSystemEnvironmentPropertySource] to EncryptableSystemEnvironmentPropertySourceWrapper
2021-02-03 18:20:02,651 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource random [org.springframework.boot.env.RandomValuePropertySource] to EncryptablePropertySourceWrapper
2021-02-03 18:20:02,651 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource applicationConfig: [classpath:/application.properties] [org.springframework.boot.env.OriginTrackedMapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-03 18:20:02,677 INFO [main] c.u.j.f.DefaultLazyPropertyFilter [DefaultLazyPropertyFilter.java:34] Property Filter custom Bean not found with name 'encryptablePropertyFilter'. Initializing Default Property Filter
2021-02-03 18:20:02,718 INFO [main] o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [PostProcessorRegistrationDelegate.java:330] Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$57750304] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2021-02-03 18:20:02,760 INFO [main] o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [PostProcessorRegistrationDelegate.java:330] Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$83c2f181] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2021-02-03 18:20:02,939 INFO [main] c.u.j.r.DefaultLazyPropertyResolver [DefaultLazyPropertyResolver.java:35] Property Resolver custom Bean not found with name 'encryptablePropertyResolver'. Initializing Default Property Resolver
2021-02-03 18:20:02,942 INFO [main] c.u.j.d.DefaultLazyPropertyDetector [DefaultLazyPropertyDetector.java:33] Property Detector custom Bean not found with name 'encryptablePropertyDetector'. Initializing Default Property Detector
2021-02-03 18:20:04,696 INFO [main] o.e.p.PluginsService [PluginsService.java:190] no modules loaded
2021-02-03 18:20:04,698 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
2021-02-03 18:20:04,699 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
2021-02-03 18:20:04,699 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2021-02-03 18:20:04,700 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
2021-02-03 18:20:04,700 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2021-02-03 18:20:08,507 INFO [main] o.s.d.e.c.TransportClientFactoryBean [TransportClientFactoryBean.java:88] Adding transport node : 127.0.0.1:9300
2021-02-03 18:20:12,059 INFO [main] o.s.s.c.ThreadPoolTaskExecutor [ExecutorConfigurationSupport.java:171] Initializing ExecutorService 'applicationTaskExecutor'
2021-02-03 18:20:12,243 INFO [main] o.s.b.a.w.s.WelcomePageHandlerMapping [WelcomePageHandlerMapping.java:61] Adding welcome page template: index
2021-02-03 18:20:13,348 INFO [main] c.z.h.HikariDataSource [HikariDataSource.java:110] HikariPool-1 - Starting...
2021-02-03 18:20:13,628 INFO [main] c.z.h.HikariDataSource [HikariDataSource.java:123] HikariPool-1 - Start completed.
2021-02-03 18:20:13,754 INFO [main] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1208] Using default implementation for ThreadExecutor
2021-02-03 18:20:13,776 INFO [main] o.q.c.SchedulerSignalerImpl [SchedulerSignalerImpl.java:61] Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
2021-02-03 18:20:13,776 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:229] Quartz Scheduler v.2.3.1 created.
2021-02-03 18:20:13,783 INFO [main] o.s.s.q.LocalDataSourceJobStore [JobStoreSupport.java:672] Using db table-based data access locking (synchronization).
2021-02-03 18:20:13,787 INFO [main] o.s.s.q.LocalDataSourceJobStore [JobStoreCMT.java:145] JobStoreCMT initialized.
2021-02-03 18:20:13,788 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:294] Scheduler meta-data: Quartz Scheduler (v2.3.1) 'communityScheduler' with instanceId 'LAPTOP-5SJBI05C1612347613757'
  Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
  NOT STARTED.
  Currently in standby mode.
  Number of jobs executed: 0
  Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 5 threads.
  Using job-store 'org.springframework.scheduling.quartz.LocalDataSourceJobStore' - which supports persistence. and is clustered.

2021-02-03 18:20:13,788 INFO [main] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1362] Quartz scheduler 'communityScheduler' initialized from an externally provided properties instance.
2021-02-03 18:20:13,789 INFO [main] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1366] Quartz scheduler version: 2.3.1
2021-02-03 18:20:13,789 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:2293] JobFactory set to: org.springframework.scheduling.quartz.SpringBeanJobFactory@10820978
2021-02-03 18:20:14,014 INFO [main] o.s.b.a.s.s.UserDetailsServiceAutoConfiguration [UserDetailsServiceAutoConfiguration.java:87]

Using generated security password: 88bf7845-bd8a-4a40-8d20-334d9bb569bb

2021-02-03 18:20:14,134 INFO [main] o.s.s.w.DefaultSecurityFilterChain [DefaultSecurityFilterChain.java:43] Creating filter chain: Ant [pattern='/resources/**'], []
2021-02-03 18:20:14,272 INFO [main] o.s.s.w.DefaultSecurityFilterChain [DefaultSecurityFilterChain.java:43] Creating filter chain: any request, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@192ecf8, org.springframework.security.web.context.SecurityContextPersistenceFilter@4d9bccfe, org.springframework.security.web.header.HeaderWriterFilter@3b29d36c, org.springframework.security.web.authentication.logout.LogoutFilter@26728255, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@289cf7db, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@7607340f, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@263e512e, org.springframework.security.web.session.SessionManagementFilter@1b82f62a, org.springframework.security.web.access.ExceptionTranslationFilter@794f937a, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@4a7c72af]
2021-02-03 18:20:14,486 INFO [main] o.a.k.c.c.ConsumerConfig [AbstractConfig.java:279] ConsumerConfig values:
    auto.commit.interval.ms = 3000
    auto.offset.reset = latest
    bootstrap.servers = [localhost:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = true
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = test-consumer-group
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2021-02-03 18:20:14,596 INFO [main] o.a.k.c.u.AppInfoParser [AppInfoParser.java:109] Kafka version : 2.0.1
2021-02-03 18:20:14,596 INFO [main] o.a.k.c.u.AppInfoParser [AppInfoParser.java:110] Kafka commitId : fa14705e51bd2ce5
2021-02-03 18:21:14,761 INFO [main] o.s.s.q.SchedulerFactoryBean [SchedulerFactoryBean.java:844] Shutting down Quartz Scheduler
2021-02-03 18:21:14,762 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:666] Scheduler communityScheduler_$_LAPTOP-5SJBI05C1612347613757 shutting down.
2021-02-03 18:21:14,762 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:585] Scheduler communityScheduler_$_LAPTOP-5SJBI05C1612347613757 paused.
2021-02-03 18:21:14,762 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:740] Scheduler communityScheduler_$_LAPTOP-5SJBI05C1612347613757 shutdown complete.
2021-02-03 18:21:14,768 INFO [main] o.s.s.c.ThreadPoolTaskExecutor [ExecutorConfigurationSupport.java:208] Shutting down ExecutorService 'applicationTaskExecutor'
2021-02-03 18:21:17,842 INFO [main] c.g.c.CaffeineTest [StartupInfoLogger.java:50] Starting CaffeineTest on LAPTOP-5SJBI05C with PID 20856 (started by 19124 in E:\GreateCommunity)
2021-02-03 18:21:17,843 INFO [main] c.g.c.CaffeineTest [SpringApplication.java:675] No active profile set, falling back to default profiles: default
2021-02-03 18:21:18,859 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:244] Multiple Spring Data modules found, entering strict repository configuration mode!
2021-02-03 18:21:18,862 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:126] Bootstrapping Spring Data repositories in DEFAULT mode.
2021-02-03 18:21:19,069 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:182] Finished Spring Data repository scanning in 199ms. Found 1 repository interfaces.
2021-02-03 18:21:19,081 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:244] Multiple Spring Data modules found, entering strict repository configuration mode!
2021-02-03 18:21:19,082 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:126] Bootstrapping Spring Data repositories in DEFAULT mode.
2021-02-03 18:21:19,101 INFO [main] o.s.d.r.c.RepositoryConfigurationExtensionSupport [RepositoryConfigurationExtensionSupport.java:363] Spring Data Redis - Could not safely identify store assignment for repository candidate interface com.greate.community.dao.elasticsearch.DiscussPostRepository.
2021-02-03 18:21:19,102 INFO [main] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:182] Finished Spring Data repository scanning in 12ms. Found 0 repository interfaces.
2021-02-03 18:21:19,522 INFO [main] c.u.j.c.EnableEncryptablePropertiesBeanFactoryPostProcessor [EnableEncryptablePropertiesBeanFactoryPostProcessor.java:48] Post-processing PropertySource instances
2021-02-03 18:21:19,640 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource configurationProperties [org.springframework.boot.context.properties.source.ConfigurationPropertySourcesPropertySource] to AOP Proxy
2021-02-03 18:21:19,640 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource Inlined Test Properties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-03 18:21:19,640 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource systemProperties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-03 18:21:19,641 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource systemEnvironment [org.springframework.boot.env.SystemEnvironmentPropertySourceEnvironmentPostProcessor$OriginAwareSystemEnvironmentPropertySource] to EncryptableSystemEnvironmentPropertySourceWrapper
2021-02-03 18:21:19,641 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource random [org.springframework.boot.env.RandomValuePropertySource] to EncryptablePropertySourceWrapper
2021-02-03 18:21:19,641 INFO [main] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource applicationConfig: [classpath:/application.properties] [org.springframework.boot.env.OriginTrackedMapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-03 18:21:19,676 INFO [main] c.u.j.f.DefaultLazyPropertyFilter [DefaultLazyPropertyFilter.java:34] Property Filter custom Bean not found with name 'encryptablePropertyFilter'. Initializing Default Property Filter
2021-02-03 18:21:19,734 INFO [main] o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [PostProcessorRegistrationDelegate.java:330] Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$f02ba0b0] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2021-02-03 18:21:19,793 INFO [main] o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [PostProcessorRegistrationDelegate.java:330] Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$1c798f2d] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2021-02-03 18:21:19,994 INFO [main] c.u.j.r.DefaultLazyPropertyResolver [DefaultLazyPropertyResolver.java:35] Property Resolver custom Bean not found with name 'encryptablePropertyResolver'. Initializing Default Property Resolver
2021-02-03 18:21:19,997 INFO [main] c.u.j.d.DefaultLazyPropertyDetector [DefaultLazyPropertyDetector.java:33] Property Detector custom Bean not found with name 'encryptablePropertyDetector'. Initializing Default Property Detector
2021-02-03 18:21:21,947 INFO [main] o.e.p.PluginsService [PluginsService.java:190] no modules loaded
2021-02-03 18:21:21,949 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
2021-02-03 18:21:21,949 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
2021-02-03 18:21:21,949 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2021-02-03 18:21:21,950 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
2021-02-03 18:21:21,950 INFO [main] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2021-02-03 18:21:24,084 INFO [main] o.s.d.e.c.TransportClientFactoryBean [TransportClientFactoryBean.java:88] Adding transport node : 127.0.0.1:9300
2021-02-03 18:21:27,265 INFO [main] o.s.s.c.ThreadPoolTaskExecutor [ExecutorConfigurationSupport.java:171] Initializing ExecutorService 'applicationTaskExecutor'
2021-02-03 18:21:27,442 INFO [main] o.s.b.a.w.s.WelcomePageHandlerMapping [WelcomePageHandlerMapping.java:61] Adding welcome page template: index
2021-02-03 18:21:28,595 INFO [main] c.z.h.HikariDataSource [HikariDataSource.java:110] HikariPool-1 - Starting...
2021-02-03 18:21:28,886 INFO [main] c.z.h.HikariDataSource [HikariDataSource.java:123] HikariPool-1 - Start completed.
2021-02-03 18:21:29,037 INFO [main] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1208] Using default implementation for ThreadExecutor
2021-02-03 18:21:29,058 INFO [main] o.q.c.SchedulerSignalerImpl [SchedulerSignalerImpl.java:61] Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
2021-02-03 18:21:29,059 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:229] Quartz Scheduler v.2.3.1 created.
2021-02-03 18:21:29,065 INFO [main] o.s.s.q.LocalDataSourceJobStore [JobStoreSupport.java:672] Using db table-based data access locking (synchronization).
2021-02-03 18:21:29,069 INFO [main] o.s.s.q.LocalDataSourceJobStore [JobStoreCMT.java:145] JobStoreCMT initialized.
2021-02-03 18:21:29,070 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:294] Scheduler meta-data: Quartz Scheduler (v2.3.1) 'communityScheduler' with instanceId 'LAPTOP-5SJBI05C1612347689041'
  Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
  NOT STARTED.
  Currently in standby mode.
  Number of jobs executed: 0
  Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 5 threads.
  Using job-store 'org.springframework.scheduling.quartz.LocalDataSourceJobStore' - which supports persistence. and is clustered.

2021-02-03 18:21:29,071 INFO [main] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1362] Quartz scheduler 'communityScheduler' initialized from an externally provided properties instance.
2021-02-03 18:21:29,071 INFO [main] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1366] Quartz scheduler version: 2.3.1
2021-02-03 18:21:29,071 INFO [main] o.q.c.QuartzScheduler [QuartzScheduler.java:2293] JobFactory set to: org.springframework.scheduling.quartz.SpringBeanJobFactory@226e95e9
2021-02-03 18:21:29,298 INFO [main] o.s.b.a.s.s.UserDetailsServiceAutoConfiguration [UserDetailsServiceAutoConfiguration.java:87]

Using generated security password: cf31059b-5f27-476b-bb5b-3bd75df2e589

2021-02-03 18:21:29,404 INFO [main] o.s.s.w.DefaultSecurityFilterChain [DefaultSecurityFilterChain.java:43] Creating filter chain: Ant [pattern='/resources/**'], []
2021-02-03 18:21:29,533 INFO [main] o.s.s.w.DefaultSecurityFilterChain [DefaultSecurityFilterChain.java:43] Creating filter chain: any request, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@21d48c40, org.springframework.security.web.context.SecurityContextPersistenceFilter@626b9092, org.springframework.security.web.header.HeaderWriterFilter@544300a6, org.springframework.security.web.authentication.logout.LogoutFilter@289cf7db, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@4e25282d, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@56820446, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@4a10c019, org.springframework.security.web.session.SessionManagementFilter@152dbf8e, org.springframework.security.web.access.ExceptionTranslationFilter@20e9c165, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@6761f75b]
2021-02-03 18:21:29,766 INFO [main] o.a.k.c.c.ConsumerConfig [AbstractConfig.java:279] ConsumerConfig values:
    auto.commit.interval.ms = 3000
    auto.offset.reset = latest
    bootstrap.servers = [localhost:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = true
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = test-consumer-group
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2021-02-03 18:21:29,868 INFO [main] o.a.k.c.u.AppInfoParser [AppInfoParser.java:109] Kafka version : 2.0.1
2021-02-03 18:21:29,868 INFO [main] o.a.k.c.u.AppInfoParser [AppInfoParser.java:110] Kafka commitId : fa14705e51bd2ce5
65  log/community/info/log-info-2021-02-04.0.log  Normal file
@ -0,0 +1,65 @@
2021-02-04 21:39:48,571 INFO [restartedMain] c.g.c.CommunityApplication [StartupInfoLogger.java:50] Starting CommunityApplication on LAPTOP-5SJBI05C with PID 10528 (E:\GreateCommunity\target\classes started by 19124 in E:\GreateCommunity)
2021-02-04 21:39:48,577 INFO [restartedMain] c.g.c.CommunityApplication [SpringApplication.java:675] No active profile set, falling back to default profiles: default
2021-02-04 21:39:48,634 INFO [restartedMain] o.s.b.d.e.DevToolsPropertyDefaultsPostProcessor [DeferredLog.java:227] Devtools property defaults active! Set 'spring.devtools.add-properties' to 'false' to disable
2021-02-04 21:39:48,634 INFO [restartedMain] o.s.b.d.e.DevToolsPropertyDefaultsPostProcessor [DeferredLog.java:227] For additional web related logging consider setting the 'logging.level.web' property to 'DEBUG'
2021-02-04 21:39:49,322 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:244] Multiple Spring Data modules found, entering strict repository configuration mode!
2021-02-04 21:39:49,324 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:126] Bootstrapping Spring Data repositories in DEFAULT mode.
2021-02-04 21:39:49,385 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:182] Finished Spring Data repository scanning in 58ms. Found 1 repository interfaces.
2021-02-04 21:39:49,395 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:244] Multiple Spring Data modules found, entering strict repository configuration mode!
2021-02-04 21:39:49,396 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:126] Bootstrapping Spring Data repositories in DEFAULT mode.
2021-02-04 21:39:49,410 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationExtensionSupport [RepositoryConfigurationExtensionSupport.java:363] Spring Data Redis - Could not safely identify store assignment for repository candidate interface com.greate.community.dao.elasticsearch.DiscussPostRepository.
2021-02-04 21:39:49,410 INFO [restartedMain] o.s.d.r.c.RepositoryConfigurationDelegate [RepositoryConfigurationDelegate.java:182] Finished Spring Data repository scanning in 8ms. Found 0 repository interfaces.
2021-02-04 21:39:49,711 INFO [restartedMain] c.u.j.c.EnableEncryptablePropertiesBeanFactoryPostProcessor [EnableEncryptablePropertiesBeanFactoryPostProcessor.java:48] Post-processing PropertySource instances
2021-02-04 21:39:49,763 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource configurationProperties [org.springframework.boot.context.properties.source.ConfigurationPropertySourcesPropertySource] to AOP Proxy
2021-02-04 21:39:49,763 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource servletConfigInitParams [org.springframework.core.env.PropertySource$StubPropertySource] to EncryptablePropertySourceWrapper
2021-02-04 21:39:49,763 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource servletContextInitParams [org.springframework.core.env.PropertySource$StubPropertySource] to EncryptablePropertySourceWrapper
2021-02-04 21:39:49,764 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource systemProperties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-04 21:39:49,764 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource systemEnvironment [org.springframework.boot.env.SystemEnvironmentPropertySourceEnvironmentPostProcessor$OriginAwareSystemEnvironmentPropertySource] to EncryptableSystemEnvironmentPropertySourceWrapper
2021-02-04 21:39:49,764 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource random [org.springframework.boot.env.RandomValuePropertySource] to EncryptablePropertySourceWrapper
2021-02-04 21:39:49,765 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource applicationConfig: [classpath:/application.properties] [org.springframework.boot.env.OriginTrackedMapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-04 21:39:49,765 INFO [restartedMain] c.u.j.EncryptablePropertySourceConverter [EncryptablePropertySourceConverter.java:41] Converting PropertySource devtools [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper
2021-02-04 21:39:49,788 INFO [restartedMain] c.u.j.f.DefaultLazyPropertyFilter [DefaultLazyPropertyFilter.java:34] Property Filter custom Bean not found with name 'encryptablePropertyFilter'. Initializing Default Property Filter
2021-02-04 21:39:49,816 INFO [restartedMain] o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [PostProcessorRegistrationDelegate.java:330] Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type [org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$2fbe8c5] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2021-02-04 21:39:49,860 INFO [restartedMain] o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [PostProcessorRegistrationDelegate.java:330] Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$2f49d742] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2021-02-04 21:39:49,992 INFO [restartedMain] c.u.j.r.DefaultLazyPropertyResolver [DefaultLazyPropertyResolver.java:35] Property Resolver custom Bean not found with name 'encryptablePropertyResolver'. Initializing Default Property Resolver
2021-02-04 21:39:49,994 INFO [restartedMain] c.u.j.d.DefaultLazyPropertyDetector [DefaultLazyPropertyDetector.java:33] Property Detector custom Bean not found with name 'encryptablePropertyDetector'. Initializing Default Property Detector
2021-02-04 21:39:50,328 INFO [restartedMain] o.s.b.w.e.t.TomcatWebServer [TomcatWebServer.java:90] Tomcat initialized with port(s): 8080 (http)
2021-02-04 21:39:50,339 INFO [restartedMain] o.a.c.h.Http11NioProtocol [DirectJDKLog.java:173] Initializing ProtocolHandler ["http-nio-8080"]
2021-02-04 21:39:50,346 INFO [restartedMain] o.a.c.c.StandardService [DirectJDKLog.java:173] Starting service [Tomcat]
2021-02-04 21:39:50,346 INFO [restartedMain] o.a.c.c.StandardEngine [DirectJDKLog.java:173] Starting Servlet engine: [Apache Tomcat/9.0.19]
2021-02-04 21:39:50,449 INFO [restartedMain] o.a.c.c.C.[.[.[/echo] [DirectJDKLog.java:173] Initializing Spring embedded WebApplicationContext
2021-02-04 21:39:50,449 INFO [restartedMain] o.s.w.c.ContextLoader [ServletWebServerApplicationContext.java:296] Root WebApplicationContext: initialization completed in 1814 ms
2021-02-04 21:39:51,657 INFO [restartedMain] o.e.p.PluginsService [PluginsService.java:190] no modules loaded
2021-02-04 21:39:51,658 INFO [restartedMain] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
2021-02-04 21:39:51,658 INFO [restartedMain] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
2021-02-04 21:39:51,658 INFO [restartedMain] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2021-02-04 21:39:51,659 INFO [restartedMain] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
2021-02-04 21:39:51,659 INFO [restartedMain] o.e.p.PluginsService [PluginsService.java:193] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2021-02-04 21:39:52,795 INFO [restartedMain] o.s.d.e.c.TransportClientFactoryBean [TransportClientFactoryBean.java:88] Adding transport node : 127.0.0.1:9300
2021-02-04 21:39:56,301 INFO [restartedMain] o.s.b.d.a.OptionalLiveReloadServer [OptionalLiveReloadServer.java:57] LiveReload server is running on port 35729
2021-02-04 21:39:57,217 INFO [restartedMain] o.s.s.c.ThreadPoolTaskExecutor [ExecutorConfigurationSupport.java:171] Initializing ExecutorService 'applicationTaskExecutor'
2021-02-04 21:39:57,486 INFO [restartedMain] o.s.b.a.w.s.WelcomePageHandlerMapping [WelcomePageHandlerMapping.java:61] Adding welcome page template: index
2021-02-04 21:39:58,651 INFO [restartedMain] c.z.h.HikariDataSource [HikariDataSource.java:110] HikariPool-1 - Starting...
2021-02-04 21:39:58,835 INFO [restartedMain] c.z.h.HikariDataSource [HikariDataSource.java:123] HikariPool-1 - Start completed.
2021-02-04 21:39:58,937 INFO [restartedMain] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1208] Using default implementation for ThreadExecutor
2021-02-04 21:39:58,950 INFO [restartedMain] o.q.c.SchedulerSignalerImpl [SchedulerSignalerImpl.java:61] Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
2021-02-04 21:39:58,951 INFO [restartedMain] o.q.c.QuartzScheduler [QuartzScheduler.java:229] Quartz Scheduler v.2.3.1 created.
2021-02-04 21:39:58,956 INFO [restartedMain] o.s.s.q.LocalDataSourceJobStore [JobStoreSupport.java:672] Using db table-based data access locking (synchronization).
2021-02-04 21:39:58,959 INFO [restartedMain] o.s.s.q.LocalDataSourceJobStore [JobStoreCMT.java:145] JobStoreCMT initialized.
2021-02-04 21:39:58,960 INFO [restartedMain] o.q.c.QuartzScheduler [QuartzScheduler.java:294] Scheduler meta-data: Quartz Scheduler (v2.3.1) 'communityScheduler' with instanceId 'LAPTOP-5SJBI05C1612445998939'
  Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
  NOT STARTED.
  Currently in standby mode.
  Number of jobs executed: 0
  Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 5 threads.
  Using job-store 'org.springframework.scheduling.quartz.LocalDataSourceJobStore' - which supports persistence. and is clustered.

2021-02-04 21:39:58,960 INFO [restartedMain] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1362] Quartz scheduler 'communityScheduler' initialized from an externally provided properties instance.
2021-02-04 21:39:58,960 INFO [restartedMain] o.q.i.StdSchedulerFactory [StdSchedulerFactory.java:1366] Quartz scheduler version: 2.3.1
2021-02-04 21:39:58,960 INFO [restartedMain] o.q.c.QuartzScheduler [QuartzScheduler.java:2293] JobFactory set to: org.springframework.scheduling.quartz.SpringBeanJobFactory@3abb7185
2021-02-04 21:39:59,089 INFO [restartedMain] o.s.b.a.s.s.UserDetailsServiceAutoConfiguration [UserDetailsServiceAutoConfiguration.java:87]

Using generated security password: b0b97a37-0b8f-4fad-83af-e19207ac88d6

2021-02-04 21:39:59,149 INFO [restartedMain] o.s.s.w.DefaultSecurityFilterChain [DefaultSecurityFilterChain.java:43] Creating filter chain: Ant [pattern='/resources/**'], []
2021-02-04 21:39:59,191 INFO [restartedMain] o.s.s.w.DefaultSecurityFilterChain [DefaultSecurityFilterChain.java:43] Creating filter chain: any request, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@4a95c7cd, org.springframework.security.web.context.SecurityContextPersistenceFilter@13c09085, org.springframework.security.web.header.HeaderWriterFilter@647284ce, org.springframework.security.web.authentication.logout.LogoutFilter@76f61163, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@7c0c6d12, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@4d45e645, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@5ef2db3d, org.springframework.security.web.session.SessionManagementFilter@4b066f5d, org.springframework.security.web.access.ExceptionTranslationFilter@707efefa, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@bfbd66a]
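The Quartz scheduler meta-data in the startup log above (scheduler name `communityScheduler`, `LocalDataSourceJobStore`, clustered, `SimpleThreadPool` with 5 threads) would typically come from configuration along these lines. A hedged sketch — the exact property names used in this project's config file are assumptions; only the values are taken from the log:

```properties
# Hypothetical Quartz configuration matching the scheduler meta-data logged above
spring.quartz.job-store-type=jdbc
spring.quartz.properties.org.quartz.scheduler.instanceName=communityScheduler
spring.quartz.properties.org.quartz.scheduler.instanceId=AUTO
spring.quartz.properties.org.quartz.jobStore.isClustered=true
spring.quartz.properties.org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
spring.quartz.properties.org.quartz.threadPool.threadCount=5
```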
23324	log/community/info/log-info-2021-02-06.0.log	Normal file
File diff suppressed because it is too large
9483	log/community/info/log-info-2021-02-07.0.log	Normal file
File diff suppressed because it is too large
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -1,14 +1,169 @@
|
||||
2021-01-31 12:41:08,662 WARN [http-nio-8080-exec-2] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
|
||||
2021-01-31 12:41:53,872 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
|
||||
2021-01-31 12:43:41,269 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
|
||||
2021-01-31 12:46:47,290 WARN [http-nio-8080-exec-2] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
|
||||
2021-01-31 12:50:56,887 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
|
||||
2021-01-31 13:00:31,265 WARN [http-nio-8080-exec-4] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-reply.html"]
|
||||
2021-01-31 17:00:56,014 WARN [http-nio-8080-exec-3] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
|
||||
2021-01-31 17:02:14,746 WARN [http-nio-8080-exec-4] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
|
||||
2021-01-31 17:03:10,083 WARN [http-nio-8080-exec-6] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
|
||||
2021-01-31 17:03:20,433 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
|
||||
2021-01-31 17:07:07,426 WARN [http-nio-8080-exec-5] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
|
||||
2021-01-31 17:09:49,019 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
|
||||
2021-01-31 17:26:47,312 WARN [http-nio-8080-exec-1] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{GL6pOJCiTqyKMUgG5cRPbg}{127.0.0.1}{127.0.0.1:9300}]]]
|
||||
2021-01-31 17:26:52,632 WARN [http-nio-8080-exec-3] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{GL6pOJCiTqyKMUgG5cRPbg}{127.0.0.1}{127.0.0.1:9300}]]]
|
||||
2021-02-06 15:42:11,487 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:13,594 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:15,700 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:17,909 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:20,316 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:23,222 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:26,332 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:29,340 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:32,550 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:35,760 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:38,769 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:41,877 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:44,984 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:47,994 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:51,101 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:54,110 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:42:57,318 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:43:00,234 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:43:03,143 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:43:06,254 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:43:09,161 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-06 15:43:09,467 WARN [restartedMain] o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext [AbstractApplicationContext.java:557] Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
|
||||
2021-02-06 15:45:36,751 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:36,789 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:36,789 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:36,798 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:38,853 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:38,893 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:38,909 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:38,946 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:41,061 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:41,100 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:41,118 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:41,153 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:43,471 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:43,513 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:43,530 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:43,610 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:46,242 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:46,278 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:46,436 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:46,524 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:49,139 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:49,409 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:49,639 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:49,650 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:52,305 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:52,421 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:52,653 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:52,667 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:55,117 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:55,287 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:55,628 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:55,768 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:58,077 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:58,404 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:58,693 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
|
||||
2021-02-06 15:45:58,884 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:00,950 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:01,571 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:01,795 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:01,814 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:04,167 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:04,435 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:04,657 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:04,826 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:07,200 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:07,498 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:07,669 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:07,850 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:10,215 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:10,664 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:10,788 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:10,815 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:13,234 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:13,897 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:13,930 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:14,007 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:16,247 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:16,857 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:17,044 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:17,222 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:18,741 WARN [http-nio-8080-exec-1] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:19,106 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:20,024 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:20,032 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:20,107 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:20,900 WARN [http-nio-8080-exec-2] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:22,169 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:22,883 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:22,937 WARN [http-nio-8080-exec-3] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:23,045 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:23,120 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:24,986 WARN [http-nio-8080-exec-4] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:25,282 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:25,792 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:26,109 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:26,181 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:27,020 WARN [http-nio-8080-exec-5] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:28,140 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:28,649 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:29,061 WARN [http-nio-8080-exec-6] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:29,119 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:29,294 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:31,051 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:31,095 WARN [http-nio-8080-exec-7] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:31,562 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:32,336 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:32,459 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:33,142 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:34,114 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:34,371 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:35,190 WARN [http-nio-8080-exec-9] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:35,197 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:35,625 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:37,179 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:37,234 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:37,283 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:38,259 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:38,589 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:39,271 WARN [http-nio-8080-exec-1] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:40,240 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-4, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:40,349 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-2, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:41,316 WARN [http-nio-8080-exec-2] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis; nested exception is io.lettuce.core.RedisConnectionException: Unable to connect to localhost:6379]
2021-02-06 15:46:41,472 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-6, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:46:41,552 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-8, groupId=test-consumer-group] Connection to node 0 could not be established. Broker may not be available.
2021-02-06 15:55:57,290 WARN [http-nio-8080-exec-7] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-post.html"]
2021-02-06 15:56:11,045 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-post.html"]
2021-02-06 15:56:42,777 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "profile.html"]
2021-02-06 16:26:25,206 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=1m3s42ms590µs200ns).
2021-02-06 16:26:49,170 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-8, groupId=test-consumer-group] Asynchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:26:49,170 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-6, groupId=test-consumer-group] Asynchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:26:49,171 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-6, groupId=test-consumer-group] Synchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:26:49,171 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-2, groupId=test-consumer-group] Synchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=1, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:26:49,171 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-8, groupId=test-consumer-group] Synchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:26:49,171 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-4, groupId=test-consumer-group] Synchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=0, metadata=''}, like-0=OffsetAndMetadata{offset=0, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:28:16,540 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=1m15s855ms884µs400ns).
2021-02-06 16:28:28,134 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-4, groupId=test-consumer-group] Asynchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=0, metadata=''}, like-0=OffsetAndMetadata{offset=0, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:28:28,134 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-8, groupId=test-consumer-group] Asynchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:28:28,134 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-6, groupId=test-consumer-group] Asynchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:28:28,135 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-2, groupId=test-consumer-group] Asynchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=1, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:28:28,135 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-4, groupId=test-consumer-group] Asynchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=0, metadata=''}, like-0=OffsetAndMetadata{offset=0, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:28:33,092 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-2, groupId=test-consumer-group] Asynchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=1, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 16:29:12,217 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=55s677ms314µs400ns).
2021-02-06 19:52:28,661 WARN [restartedMain] o.a.c.c.C.[.[.[/echo] [DirectJDKLog.java:173] Cannot deserialize session attribute [SPRING_SECURITY_CONTEXT] for session [6D364877F06640DC8D84C00323B1A703]
2021-02-06 19:53:19,325 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=46s548ms644µs301ns).
2021-02-06 19:53:46,733 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-6, groupId=test-consumer-group] Asynchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,733 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-8, groupId=test-consumer-group] Asynchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,733 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-4, groupId=test-consumer-group] Asynchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=0, metadata=''}, like-0=OffsetAndMetadata{offset=0, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,733 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-2, groupId=test-consumer-group] Asynchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=1, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,734 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-8, groupId=test-consumer-group] Synchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,734 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-2, groupId=test-consumer-group] Synchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=1, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,734 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-6, groupId=test-consumer-group] Synchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:46,734 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-4, groupId=test-consumer-group] Synchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=0, metadata=''}, like-0=OffsetAndMetadata{offset=0, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:53:52,595 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-8, groupId=test-consumer-group] Asynchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 19:54:28,911 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=1m9s587ms579µs900ns).
2021-02-06 20:13:20,594 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-reply.html"]
2021-02-06 21:04:19,390 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-reply.html"]
2021-02-06 21:13:54,712 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-reply.html"]
2021-02-06 21:19:52,791 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=1m54s355ms222µs201ns).
2021-02-06 21:19:52,808 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-6, groupId=test-consumer-group] Asynchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=2, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,808 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-8, groupId=test-consumer-group] Asynchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,808 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-2, groupId=test-consumer-group] Asynchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,808 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:737] [Consumer clientId=consumer-4, groupId=test-consumer-group] Asynchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=1, metadata=''}, like-0=OffsetAndMetadata{offset=1, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,809 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-6, groupId=test-consumer-group] Synchronous auto-commit of offsets {publish-0=OffsetAndMetadata{offset=2, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,809 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#2-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-2, groupId=test-consumer-group] Synchronous auto-commit of offsets {delete-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,809 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-8, groupId=test-consumer-group] Synchronous auto-commit of offsets {share-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:19:52,809 WARN [org.springframework.kafka.KafkaListenerEndpointContainer#3-0-C-1] o.a.k.c.c.i.ConsumerCoordinator [ConsumerCoordinator.java:759] [Consumer clientId=consumer-4, groupId=test-consumer-group] Synchronous auto-commit of offsets {comment-0=OffsetAndMetadata{offset=1, metadata=''}, like-0=OffsetAndMetadata{offset=1, metadata=''}, follow-0=OffsetAndMetadata{offset=0, metadata=''}} failed: Commit cannot be completed since the group has already rebalanced and assigned the partitions to another member. This means that the time between subsequent calls to poll() was longer than the configured max.poll.interval.ms, which typically implies that the poll loop is spending too much time message processing. You can address this either by increasing the session timeout or by reducing the maximum size of batches returned in poll() with max.poll.records.
2021-02-06 21:22:32,690 WARN [HikariPool-1 housekeeper] c.z.h.p.HikariPool [HikariPool.java:766] HikariPool-1 - Thread starvation or clock leap detected (housekeeper delta=1m56s920ms673µs100ns).
14
log/community/warn/log-warn-2021-01-31.0.log
Normal file
@ -0,0 +1,14 @@
2021-01-31 12:41:08,662 WARN [http-nio-8080-exec-2] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
2021-01-31 12:41:53,872 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
2021-01-31 12:43:41,269 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
2021-01-31 12:46:47,290 WARN [http-nio-8080-exec-2] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
2021-01-31 12:50:56,887 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String]
2021-01-31 13:00:31,265 WARN [http-nio-8080-exec-4] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.springframework.web.method.annotation.MethodArgumentTypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'int'; nested exception is java.lang.NumberFormatException: For input string: "my-reply.html"]
2021-01-31 17:00:56,014 WARN [http-nio-8080-exec-3] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
2021-01-31 17:02:14,746 WARN [http-nio-8080-exec-4] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
2021-01-31 17:03:10,083 WARN [http-nio-8080-exec-6] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
2021-01-31 17:03:20,433 WARN [http-nio-8080-exec-10] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
2021-01-31 17:07:07,426 WARN [http-nio-8080-exec-5] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
2021-01-31 17:09:49,019 WARN [http-nio-8080-exec-8] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): com.greate.community.dao.DiscussPostMapper.updateCommentCount]
2021-01-31 17:26:47,312 WARN [http-nio-8080-exec-1] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{GL6pOJCiTqyKMUgG5cRPbg}{127.0.0.1}{127.0.0.1:9300}]]]
2021-01-31 17:26:52,632 WARN [http-nio-8080-exec-3] o.s.w.s.m.m.a.ExceptionHandlerExceptionResolver [AbstractHandlerExceptionResolver.java:198] Resolved [NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{GL6pOJCiTqyKMUgG5cRPbg}{127.0.0.1}{127.0.0.1:9300}]]]
23
log/community/warn/log-warn-2021-02-02.0.log
Normal file
@ -0,0 +1,23 @@
2021-02-02 15:02:07,451 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:09,558 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:11,762 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:13,964 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:16,468 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:19,481 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:22,500 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:25,610 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:28,818 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:31,829 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:34,940 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:37,947 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:40,853 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:43,858 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:46,763 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:49,973 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:52,980 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:56,187 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:02:59,296 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:03:02,204 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:03:05,210 WARN [restartedMain] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-02 15:03:05,443 WARN [restartedMain] o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext [AbstractApplicationContext.java:557] Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
2021-02-02 15:37:39,179 WARN [restartedMain] o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext [AbstractApplicationContext.java:557] Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'eventConsumer': Unsatisfied dependency expressed through field 'taskScheduler'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
34
log/community/warn/log-warn-2021-02-03.0.log
Normal file
@ -0,0 +1,34 @@
2021-02-03 18:20:16,770 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:18,880 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:20,986 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:23,205 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:25,713 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:28,721 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:31,931 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:35,139 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:38,349 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:41,255 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:44,364 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:47,574 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:50,687 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:53,598 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:56,808 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:20:59,919 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:02,829 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:06,039 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:09,047 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:12,260 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:14,760 WARN [main] o.s.w.c.s.GenericWebApplicationContext [AbstractApplicationContext.java:557] Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
2021-02-03 18:21:32,018 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:34,125 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:36,330 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
2021-02-03 18:21:38,634 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:41,039 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:43,949 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:46,957 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:49,865 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:52,876 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:55,885 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:21:58,992 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:22:02,100 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
||||
2021-02-03 18:22:05,313 WARN [main] o.a.k.c.NetworkClient [NetworkClient.java:671] [Consumer clientId=consumer-1, groupId=test-consumer-group] Connection to node -1 could not be established. Broker may not be available.
|
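The repeated warnings above, followed by the context-initialization failure, mean the Spring Kafka consumer could not reach any broker at its configured bootstrap address before the topic-metadata fetch timed out, which aborted application startup. A minimal sketch of the relevant Spring Boot settings — the broker address `localhost:9092` is an assumption (only the group id `test-consumer-group` appears in the log):

```properties
# application.properties — point Spring Kafka at a reachable broker.
# localhost:9092 is an assumed address; substitute the actual broker host:port.
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=test-consumer-group
```

Starting ZooKeeper and the Kafka server before the application (e.g. `bin/kafka-server-start.sh config/server.properties` for Kafka 2.7) avoids the `TimeoutException` that cancelled the context refresh here.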
15
node_modules/.bin/acorn
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../acorn/bin/acorn" "$@"
  ret=$?
else
  node "$basedir/../acorn/bin/acorn" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/acorn.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\acorn\bin\acorn" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/acorn.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../acorn/bin/acorn" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../acorn/bin/acorn" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/ansi-html
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../ansi-html/bin/ansi-html" "$@"
  ret=$?
else
  node "$basedir/../ansi-html/bin/ansi-html" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/ansi-html.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\ansi-html\bin\ansi-html" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/ansi-html.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../ansi-html/bin/ansi-html" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../ansi-html/bin/ansi-html" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/atob
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../atob/bin/atob.js" "$@"
  ret=$?
else
  node "$basedir/../atob/bin/atob.js" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/atob.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\atob\bin\atob.js" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/atob.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../atob/bin/atob.js" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../atob/bin/atob.js" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/autoprefixer
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../autoprefixer/bin/autoprefixer" "$@"
  ret=$?
else
  node "$basedir/../autoprefixer/bin/autoprefixer" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/autoprefixer.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\autoprefixer\bin\autoprefixer" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/autoprefixer.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../autoprefixer/bin/autoprefixer" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../autoprefixer/bin/autoprefixer" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/browserslist
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../browserslist/cli.js" "$@"
  ret=$?
else
  node "$basedir/../browserslist/cli.js" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/browserslist.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\browserslist\cli.js" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/browserslist.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../browserslist/cli.js" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../browserslist/cli.js" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/cssesc
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../cssesc/bin/cssesc" "$@"
  ret=$?
else
  node "$basedir/../cssesc/bin/cssesc" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/cssesc.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\cssesc\bin\cssesc" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/cssesc.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../cssesc/bin/cssesc" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../cssesc/bin/cssesc" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/envify
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../envify/bin/envify" "$@"
  ret=$?
else
  node "$basedir/../envify/bin/envify" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/envify.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\envify\bin\envify" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/envify.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../envify/bin/envify" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../envify/bin/envify" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/envinfo
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../envinfo/dist/cli.js" "$@"
  ret=$?
else
  node "$basedir/../envinfo/dist/cli.js" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/envinfo.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\envinfo\dist\cli.js" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/envinfo.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../envinfo/dist/cli.js" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../envinfo/dist/cli.js" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/errno
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../errno/cli.js" "$@"
  ret=$?
else
  node "$basedir/../errno/cli.js" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/errno.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\errno\cli.js" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
18
node_modules/.bin/errno.ps1
generated
vendored
Normal file
@@ -0,0 +1,18 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  & "$basedir/node$exe" "$basedir/../errno/cli.js" $args
  $ret=$LASTEXITCODE
} else {
  & "node$exe" "$basedir/../errno/cli.js" $args
  $ret=$LASTEXITCODE
}
exit $ret
15
node_modules/.bin/esparse
generated
vendored
Normal file
@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac

if [ -x "$basedir/node" ]; then
  "$basedir/node" "$basedir/../esprima/bin/esparse.js" "$@"
  ret=$?
else
  node "$basedir/../esprima/bin/esparse.js" "$@"
  ret=$?
fi
exit $ret
17
node_modules/.bin/esparse.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

"%_prog%" "%dp0%\..\esprima\bin\esparse.js" %*
ENDLOCAL
EXIT /b %errorlevel%
:find_dp0
SET dp0=%~dp0
EXIT /b
Some files were not shown because too many files have changed in this diff.