
Add watermark

// Option 1: tile an SVG watermark as the page's background image
document.getElementsByTagName('body')[0].style.backgroundImage = 'url("data:image/svg+xml;utf8,<svg xmlns=\'http://www.w3.org/2000/svg\' version=\'1.1\' height=\'100px\' width=\'100px\'><text transform=\'translate(20, 100) rotate(-30)\' fill=\'rgba(128,128,128, 0.3)\' font-size=\'20\' >watermark</text></svg>")';

// Option 2: overlay a fixed, full-screen, click-through watermark layer on top of the page
const div = document.createElement("div");
div.innerHTML = `<div id="watermark" style="position: fixed; top: 0px; left: 0px; width: 100%; height: 100%; background-image: url(&quot;data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' version='1.1' width='440px' height='293.3333333333333px'><text transform='translate(5, 100) rotate(-20)' fill='rgba(128,128,128, 0.3)' font-size='20' >watermark</text></svg>&quot;); background-repeat: repeat; background-size: 300px 200px; pointer-events: none; z-index: 9999; opacity: 0.5;"></div>`;
document.body.appendChild(div);

Clean/Readable

1. Meaningful Names.

Details
  • Use Intention-Revealing Names
  • Make Meaningful Distinctions
  • Use Pronounceable Names
  • Use Searchable Names
  • Pick One Word per Concept
  • Use Solution Domain Names
  • Use Problem Domain Names
  • Avoid Disinformation
  • Avoid Encodings
  • Avoid Mental Mapping
  • Don’t Be Cute
  • Don’t Pun
  • Add Meaningful Context
  • Don’t Add Gratuitous Context

2. Don’t use magic literals (numbers and strings). Use variables or constants with descriptive names instead, for example, int MAX_LEN = 100; (see the sketch after this list).

3. Every function does only one thing.

4. Classes should be small.

5. Keep it simple, stupid. Don’t optimize prematurely.
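
For point 2, here is a minimal Java sketch of replacing magic literals with named constants (the class and constant names are invented for illustration):

public class OrderValidator {
    // Named constants instead of the magic literals 100 and "PAID"
    private static final int MAX_ITEMS_PER_ORDER = 100;
    private static final String STATUS_PAID = "PAID";

    public boolean isValid(int itemCount, String status) {
        // Instead of: if (itemCount <= 100 && "PAID".equals(status))
        return itemCount <= MAX_ITEMS_PER_ORDER && STATUS_PAID.equals(status);
    }
}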

Robust/Secure

Make a robust, well-developed solution. Do a detailed design. First, solve the problem. Then, write the code.

Write unit tests to reduce potential errors.
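
For example, a minimal JUnit 5 sketch (assuming the junit-jupiter dependency is on the test classpath):

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class ParsingTest {

    @Test
    void parsesValidNumber() {
        assertEquals(42, Integer.parseInt("42"));
    }

    @Test
    void rejectsInvalidNumber() {
        // An invalid input should fail loudly instead of being silently ignored
        assertThrows(NumberFormatException.class, () -> Integer.parseInt("42a"));
    }
}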

Error handling.

Consider boundary values.

Check for null values.
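
A small sketch that handles boundary values and null input explicitly (the Paginator class and its limits are invented for illustration):

public final class Paginator {

    private static final int DEFAULT_PAGE_SIZE = 10;
    private static final int MAX_PAGE_SIZE = 100;

    private Paginator() {
    }

    /**
     * Clamps a requested page size to a safe range.
     */
    public static int normalizePageSize(Integer requested) {
        // Null check: fall back to a sensible default instead of throwing a NullPointerException
        if (requested == null) {
            return DEFAULT_PAGE_SIZE;
        }
        // Boundary values: clamp values outside the valid range
        if (requested < 1) {
            return 1;
        }
        return Math.min(requested, MAX_PAGE_SIZE);
    }
}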

Avoid hard-coding database or other credentials in code. Use environment variables or a dotenv file instead.
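
A minimal sketch of reading database credentials from environment variables instead of hard-coding them (the variable names DB_URL, DB_USER, and DB_PASSWORD are assumptions, not a standard):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public final class DatabaseConfig {

    public static Connection openConnection() throws SQLException {
        // Read credentials from the environment (e.g. exported in the shell, a .env loader, or Docker -e flags)
        String url = System.getenv("DB_URL");
        String user = System.getenv("DB_USER");
        String password = System.getenv("DB_PASSWORD");
        if (url == null || user == null || password == null) {
            throw new IllegalStateException("DB_URL, DB_USER and DB_PASSWORD must be set");
        }
        return DriverManager.getConnection(url, user, password);
    }
}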

Maintainable/Extensible/Easy to modify

Make a maintainable and extensible system design and solution. For example:

  • Designing a maintainable and extensible data model.
  • Designing extensible APIs (an extensible structure for the parameters and return data of each API).

Reduce or eliminate duplicate code. Don’t Repeat Yourself.
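
For example, a duplicated date-formatting snippet can be extracted into one shared method that every caller reuses (a sketch; the class name is invented):

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public final class DateFormats {

    private static final DateTimeFormatter DEFAULT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    private DateFormats() {
    }

    // One shared implementation instead of the same formatting code repeated in every class
    public static String format(LocalDateTime dateTime) {
        return dateTime.format(DEFAULT);
    }
}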

Place global properties in the project configuration file, not in the source code.
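
A sketch of reading such a property from a classpath configuration file instead of hard-coding it in source (the app.properties file and the upload.dir key are assumptions for illustration):

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public final class AppConfig {

    private static final Properties PROPS = new Properties();

    static {
        // app.properties lives on the classpath, e.g. src/main/resources/app.properties
        try (InputStream in = AppConfig.class.getResourceAsStream("/app.properties")) {
            if (in != null) {
                PROPS.load(in);
            }
        } catch (IOException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    private AppConfig() {
    }

    public static String uploadDir() {
        return PROPS.getProperty("upload.dir", "/tmp/uploads");
    }
}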

Decouple modules, classes, and functions.

Follow the SOLID design principles (Single Responsibility Principle, Open-Closed Principle, Liskov Substitution Principle, Interface Segregation Principle, and Dependency Inversion Principle).
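
As one illustration, the Dependency Inversion Principle keeps a high-level class decoupled from concrete implementations by depending on an interface (a sketch with invented names):

// High-level code depends on this abstraction, not on a concrete sender.
interface MessageSender {
    void send(String to, String message);
}

// One concrete implementation; an SMS sender or a test mock could be substituted freely.
class EmailSender implements MessageSender {
    @Override
    public void send(String to, String message) {
        System.out.println("Email to " + to + ": " + message);
    }
}

class NotificationService {
    private final MessageSender sender;

    // The dependency is injected, so NotificationService never references EmailSender directly.
    NotificationService(MessageSender sender) {
        this.sender = sender;
    }

    void notifyUser(String userEmail) {
        sender.send(userEmail, "Your order has shipped.");
    }
}

NotificationService can now be given a different MessageSender (for example, a mock in tests) without changing its code.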

Write documentation.

Web Frontend/Desktop/Mobile

Consistency

Use consistent styles. The same form items on different pages should use the same styles of UI components.

Compatibility

Client apps and web pages should display correctly on different devices and browsers, and at different resolutions.

lobe-chat is an open-source project for building an AI client. It supports multiple AI providers, such as OpenAI, Claude 3, Gemini, and more. It offers several useful features, including Local Large Language Model (LLM) support, Model Visual Recognition, TTS & STT Voice Conversation, Text to Image Generation, Plugin System (Function Calling), Agent Market (GPTs), Progressive Web App (PWA), Mobile Device Adaptation, and Custom Themes.

How to Deploy

Deploying with Docker

# Always pull the latest Docker image before running
docker pull lobehub/lobe-chat
docker run -d \
--name lobe-chat \
--restart always \
-p 3210:3210 \
-e OPENAI_API_KEY=sk-xxxx \
-e ACCESS_CODE=YOUR_PASSWORD \
lobehub/lobe-chat

Deploying to Vercel

You can also fork the lobe-chat project and deploy it to Vercel.

Setting Up lobe-chat

Required Settings

The API key is a required property that must be set.

If you set the OPENAI_API_KEY environment variable when you start the project, you can use the chatbot application directly. lobe-chat will not show an error or prompt you to set an API key. If you want to authenticate users, you can set the ACCESS_CODE environment variable.

If you don’t set the environment variables OPENAI_API_KEY and ACCESS_CODE when you start the project, lobe-chat will show an error on the web page and prompt you to set an API key. You can also set an API key in the settings page before using the chatbot.

Optional Settings

Set Default Agent

Model Settings

  • Model: Choose your preferred language model, such as GPT-4.

Set an API proxy

If you need to use the OpenAI service through a proxy, you can configure the proxy address using the OPENAI_PROXY_URL environment variable:

-e OPENAI_PROXY_URL=https://my-api-proxy.com/v1

If you want to use a localhost proxy:

-e OPENAI_PROXY_URL=http://localhost:18080/v1 \
--network="host" \

or

# connect to proxy Docker container
-e OPENAI_PROXY_URL=http://{containerName}:{containerAppPort}/v1 \
--network {someNetwork} \

ChatGPT-Next-Web is an open-source project for building an AI chatbot client. This project is designed to be cross-platform, allowing it to be used on various operating systems. It currently can be used as a web or PWA application, or as a desktop application on Linux, Windows, or macOS. Additionally, it supports several AI providers, including OpenAI and Google AI.

How ChatGPT-Next-Web Works

ChatGPT-Next-Web manages your API keys locally in the browser. When you send a message in the chat box, ChatGPT-Next-Web will, based on your settings, send a request to the AI provider and render the response message.

How to Deploy

Deploying with Docker

# Always pull the latest Docker image before running
docker pull yidadaa/chatgpt-next-web
docker run -d \
--name chatgpt-next-web \
--restart always \
-p 3000:3000 \
yidadaa/chatgpt-next-web

Deploying to Vercel

You can also fork the ChatGPT-Next-Web project and deploy it to Vercel.

Setting Up ChatGPT-Next-Web

Click the settings button in the lower left corner to open the settings.

Required Settings

OpenAI API Key

Before using ChatGPT-Next-Web, you must set your OpenAI API Key in the Settings -> Custom Endpoint -> OpenAI API Key section.

Optional Settings

OpenAI Endpoint

If you have a self-deployed AI service API, you can set the value to something like http://localhost:18080.

Model

You can set your preferred model, such as gpt-4-0125-preview.

Others

Self-deployed AI services

You can use the copilot-gpt4-service to build a self-deployed AI service. To start an AI service, run the following command:

docker run -d \
--name copilot-gpt4-service \
--restart always \
-p 18080:8080 \
aaamoon/copilot-gpt4-service:latest

or

docker network create chatgpt

docker run -d \
--name copilot-gpt4-service \
--restart always \
-p 18080:8080 \
--network chatgpt \
aaamoon/copilot-gpt4-service:latest

OpenAI Proxy

openai-scf-proxy: Use Tencent Cloud Serverless to set up an OpenAI proxy in one minute.

Gradle is a build automation tool for multi-language software development. It controls the development process, from compilation and packaging to testing, deployment, and publishing.

In this post, I will introduce the basic use of Gradle. This post is based on Gradle 8.6 and Kotlin DSL.

Initialize a Gradle Project

You can run the following commands to initialize a Java project with Gradle:

$ gradle init --use-defaults --type java-application

or

$ gradle init \
--type java-application \
--dsl kotlin \
--test-framework junit-jupiter \
--package my.project \
--project-name my-project \
--no-split-project \
--java-version 21

If you want to create a Spring Boot application, you can use Spring Initializr.

Configurations

There are two configuration files in Gradle: build.gradle and settings.gradle. They are both important configuration files in a Gradle project, but they serve different purposes.

build.gradle is a script file that defines the configuration of a project. It’s written in the Groovy or Kotlin programming languages, and it specifies how tasks are executed, dependencies are managed, and artifacts are built. This file typically resides in the root directory of your project.

settings.gradle is focused on configuring the structure of a multi-project build and managing the relationships between different projects within it.

Plugins

You can add plugins to the configuration file build.gradle.kts like this:

plugins {
    java
    id("org.springframework.boot") version "3.2.3"
    id("io.spring.dependency-management") version "1.1.4"
}

Gradle core plugins:

  • java: Provides support for building any type of Java project.
  • application: Provides support for building JVM-based, runnable applications.

Spring Boot plugins:

  • org.springframework.boot: Spring Boot Gradle Plugin
  • io.spring.dependency-management: A Gradle plugin that provides Maven-like dependency management functionality. It will control the versions of your project’s direct and transitive dependencies.

More plugins:

Setting Properties

The following are common properties for Java projects.

group = "com.example"
version = "0.0.1-SNAPSHOT"

java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(21)
    }
}
// or
java {
    sourceCompatibility = JavaVersion.VERSION_21
    targetCompatibility = JavaVersion.VERSION_21
}

application {
    mainClass = "com.example.Main"
}

Repositories

A Repository is a source for 3rd party libraries.

repositories {
    mavenCentral()
}

Declare dependencies

You can declare dependencies in build.gradle.kts like this:

dependencies {
    // Compile-only dependency (needed to compile, but not packaged at runtime)
    compileOnly("org.projectlombok:lombok:1.18.30")

    // Implementation dependencies
    implementation("joda-time:joda-time:2.2")
    implementation("org.springframework.boot:spring-boot-starter-web")

    // Runtime-only dependency
    runtimeOnly("com.mysql:mysql-connector-j")

    // Test dependencies
    testImplementation("junit:junit:4.12")
    testImplementation("org.springframework.boot:spring-boot-starter-test")
}

In Gradle, dependencies can be classified into several types based on where they come from and how they are managed. Here are the main dependency types:

  1. Compile Dependencies:
    1. These are dependencies required for compiling and building your project. They typically include libraries and frameworks that your code directly depends on to compile successfully.
    2. Dependencies declared with compile are visible to all modules, including downstream consumers. This means that if Module A has a compile dependency on a library, and Module B depends on Module A, then Module B also has access to that library transitively. However, this also exposes the implementation details of Module A to Module B, potentially causing coupling between modules. In Gradle 3.4 and later, compile is deprecated in favor of implementation.
  2. Implementation Dependencies:
    1. Introduced in Gradle 3.4, these dependencies are similar to compile dependencies but have a more restricted visibility.
    2. They are not exposed to downstream consumers of your library or module. This allows for better encapsulation and prevents leaking implementation details. This means that if Module A has an implementation dependency on a library, Module B, depending on Module A, does not have access to that library transitively. This enhances encapsulation and modularity by hiding implementation details of a module from its consumers. It allows for better dependency management and reduces coupling between modules in multi-module projects.
  3. Runtime Dependencies: Dependencies that are only required at runtime, not for compilation. They are needed to execute your application but not to build it.
  4. Test Dependencies: Dependencies required for testing your code. These include testing frameworks, libraries, and utilities used in unit tests, integration tests, or other testing scenarios.
  5. Optional Dependencies: Dependencies that are not strictly required for your project to function but are nice to have. Gradle does not include optional dependencies by default, but you can specify them if needed.

Tasks

tasks.withType<Test> {
    useJUnitPlatform()
}

Run Tasks

To list all the available tasks in the project:

$ gradle tasks

Build Java

Before building a Java project, ensure that the java plugin is added to the configuration file build.gradle.kts.

plugins {
    java
}

Run the following command to build the project:

$ gradle build

Run Java main class

To run a Java project, ensure that the application plugin and the mainClass configuration are added to the configuration file build.gradle.kts. The application plugin makes code runnable.

plugins {
    // Apply the application plugin to add support for building a CLI application in Java.
    application
}

application {
    // Define the main class for the application.
    mainClass = "org.example.App"
}

Run the following command to execute the main method of a Java project:

$ gradle run

Gradle Wrapper

The Gradle Wrapper is the preferred way of starting a Gradle build. It consists of a batch script for Windows and a shell script for OS X and Linux. These scripts allow you to run a Gradle build without requiring that Gradle be installed on your system.

The Wrapper is a script that invokes a declared version of Gradle, downloading it beforehand if necessary. As a result, developers can get up and running with a Gradle project quickly.

Gradle Wrapper files:

  • gradle/wrapper/gradle-wrapper.jar: The Wrapper JAR file containing code for downloading the Gradle distribution.
  • gradle/wrapper/gradle-wrapper.properties: A properties file responsible for configuring the Wrapper runtime behavior e.g. the Gradle version compatible with this version.
  • gradlew, gradlew.bat: A shell script and a Windows batch script for executing the build with the Wrapper.

If the project you are working on does not contain those Wrapper files, you can generate them.

$ gradle wrapper

Run tasks with gradlew:

$ ./gradlew tasks
$ ./gradlew build
$ ./gradlew run
$ ./gradlew test


Logback is a popular logging framework for Java applications, designed as a successor to the well-known Apache Log4j framework. It’s known for its flexibility, performance, and configurability. Logback is extensively used in enterprise-level Java applications for logging events and messages.

In this post, I will cover various aspects of using Logback with Spring Boot.

Dependencies

Before we can use Logback in a Spring Boot application, we need to add its library dependencies to the project.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
    <version>{LATEST_VERSION}</version>
</dependency>

It contains ch.qos.logback:logback-classic and org.slf4j:slf4j-api.

Logback Configuration Files

Spring Boot projects use logback-spring.xml or logback.xml in the resources directory as the Logback configuration file by default.

Priority of the Logback default configuration file: logback.xml > logback-spring.xml.

If you want to use a custom filename, you can specify the log configuration file path in application.properties or application.yml. For example:

logging:
  config: classpath:my-logback.xml

Logback Basic Configurations

Property

You can define properties that can be referenced elsewhere in the configuration file. Common properties include the log message pattern, log file path, etc.

<configuration>

    <property name="console.log.pattern"
              value="%red(%d{yyyy-MM-dd HH:mm:ss}) %green([%thread]) %highlight(%-5level) %boldMagenta(%logger{36}:%line%n) - %msg%n"/>
    <property name="file.log.pattern" value="%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n"/>
    <property name="file.log.dir" value="./logs"/>
    <property name="file.log.filename" value="mylogs"/>

</configuration>

Appender

Appenders define the destination and formatting (optional) for log messages.

Types of Logback Appenders:

  • ConsoleAppender: Writes log messages to the console window (standard output or error).
  • FileAppender: Writes log messages to a specified file.
  • RollingFileAppender: Similar to FileAppender, but it creates new log files based on size or time intervals, preventing a single file from growing too large.
  • SocketAppender: Sends log messages over a network socket to a remote logging server.
  • SMTPAppender: Sends log messages as email notifications.

ConsoleAppender

<configuration>

    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>${console.log.pattern}</pattern>
            <charset>utf8</charset>
        </encoder>
    </appender>

</configuration>

FileAppender

<configuration>

    <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <file>${file.log.dir}/${file.log.filename}.log</file>
        <encoder>
            <pattern>${file.log.pattern}</pattern>
        </encoder>
    </appender>

</configuration>

RollingFileAppender

<configuration>

    <appender name="ROLLING_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${file.log.dir}/${file.log.filename}.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file will roll over daily -->
            <fileNamePattern>${file.log.dir}/${file.log.filename}-%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep 30 days' worth of logs -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <!-- Log messages greater than or equal to the level -->
            <level>INFO</level>
        </filter>
        <encoder>
            <pattern>${file.log.pattern}</pattern>
        </encoder>
    </appender>

</configuration>

RollingPolicy

A rollingPolicy is a component attached to specific appenders that dictates how and when log files are automatically managed, primarily focusing on file size and archiving. Its primary function is to prevent log files from becoming excessively large, improving manageability and performance.

Purpose:

  • Prevents large log files: By periodically rolling over (rotating) log files, you avoid single files growing too large, which can be cumbersome to manage and slow down access.
  • Archiving logs: Rolling policies can archive rolled-over log files, allowing you to retain historical logs for analysis or auditing purposes.

Functionality:

  • Triggers rollover: Based on the defined policy, the rollingPolicy determines when to create a new log file and potentially archive the existing one. Common triggers include exceeding a certain file size or reaching a specific time interval (e.g., daily, weekly).
  • Defines archive format: The policy can specify how archived log files are named and organized. This helps maintain a clear structure for historical logs.

Benefits of using rollingPolicy:

  • Manageability: Keeps log files at a manageable size, making them easier to handle and access.
  • Performance: Prevents performance issues associated with excessively large files.
  • Archiving: Allows you to retain historical logs for later use.

Common types of rollingPolicy in Logback:

  • SizeBasedTriggeringPolicy: Rolls over the log file when it reaches a specific size limit (e.g., 10 MB).
  • TimeBasedRollingPolicy: Rolls over the log file based on a time interval (e.g., daily, weekly, monthly).
  • SizeAndTimeBasedRollingPolicy: Combines size and time-based triggers, offering more control over rolling behavior.

TimeBasedRollingPolicy

<configuration>
    <appender name="ROLLING_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        ...
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file will roll over daily -->
            <fileNamePattern>${file.log.dir}/${file.log.filename}-%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep 30 days' worth of logs -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        ...
    </appender>
</configuration>

SizeAndTimeBasedRollingPolicy

<configuration>
    <appender name="ROLLING_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        ...
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>${file.log.dir}/${file.log.filename}-%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
            <!-- Each archived file's size will be at most 10MB -->
            <maxFileSize>10MB</maxFileSize>
            <!-- Keep 30 days' worth of logs -->
            <maxHistory>30</maxHistory>
            <!-- Total size cap of all archived files; when the total exceeds 100GB, the oldest archives are deleted -->
            <totalSizeCap>100GB</totalSizeCap>
        </rollingPolicy>
        ...
    </appender>
</configuration>

Filter

A filter attached to an appender allows you to control which log events are ultimately written to the defined destination (file, console, etc.) by the appender.

Commonly used filters:

  • ThresholdFilter: This filter allows log events whose level is greater than or equal to the specified level to pass through. For example, if you set the threshold to INFO, then only log events with level INFO, WARN, and ERROR will pass through.
  • LevelFilter: Similar to ThresholdFilter, but it allows more fine-grained control. You can specify both the level to match and whether to accept or deny log events at that level.

Accept only INFO-level log messages:

<configuration>
    <appender name="ROLLING_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        ...
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <level>INFO</level>
            <onMatch>ACCEPT</onMatch>
            <onMismatch>DENY</onMismatch>
        </filter>
        ...
    </appender>
</configuration>

Accept log messages with level INFO or higher:

<configuration>
    <appender name="ROLLING_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        ...
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>INFO</level>
        </filter>
        ...
    </appender>
</configuration>

Logger

A logger in logback.xml represents a category or source for log messages within your application.

There are two types of logger tags in Logback: <root> and <logger>. They form a hierarchy: every <logger> is a child of <root>, and loggers inherit configuration from their parent logger. <root> represents the top level of the hierarchy and receives log messages from all packages, while <logger> receives log messages from a specified package.

<configuration>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE"/>
    </root>

    <logger name="com.taogen" level="DEBUG" additivity="false">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE"/>
    </logger>

</configuration>

  • <root>: Receives log messages from all packages.
    • level="INFO": Sets the default level to INFO for all loggers.
    • <appender-ref>: Sends messages to the CONSOLE and ROLLING_FILE appenders.
  • <logger>
    • name="com.taogen": Receives log messages from the com.taogen package.
    • level="DEBUG": Overrides the logger level to DEBUG for that package.
    • additivity="false": Log events handled by this logger are not also propagated to the parent logger's appenders, which prevents duplicate output.
    • <appender-ref>: Sends messages to the CONSOLE and ROLLING_FILE appenders.

Using Logback

package com.taogen.commons.boot.mybatisplus;

import lombok.extern.slf4j.Slf4j;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit.jupiter.SpringExtension;

@SpringBootTest(classes = AppTest.class)
@ExtendWith(SpringExtension.class)
@Slf4j
class LogTest {

    private static Logger logger = LoggerFactory.getLogger(LogTest.class);

    private static Logger customLogger = LoggerFactory.getLogger("my-custom-log");

    @Test
    void test1() {
        log.debug("This is a debug message");
        log.info("This is an info message");
        log.warn("This is a warn message");
        log.error("This is an error message");

        logger.debug("This is a debug message");

        customLogger.debug("This is a debug message");
        customLogger.info("This is an info message");
        customLogger.warn("This is a warn message");
        customLogger.error("This is an error message");
    }
}

@Slf4j is a Lombok annotation that automatically creates a private static final field named log of type org.slf4j.Logger. This log field is initialized with an instance of the SLF4J logger for the current class.

private static final Logger log = LoggerFactory.getLogger(LogTest.class);

The commonly used Logback levels (in order of increasing severity):

  • TRACE: Captures the most detailed information.
  • DEBUG: Fine-grained information useful for debugging.
  • INFO: General application events and progress.
  • WARN: Potential problems that might not cause immediate failures.
  • ERROR: Errors that prevent the program from functioning correctly.

Relationships between Logger object and <logger> in logback.xml

  • A <logger> defined in logback.xml usually uses a package path as its name; otherwise, it uses a custom name.
  • If you use Logback to print log messages in Java code, you first pass a class or a string to the LoggerFactory.getLogger() method to get a Logger object, then call the logger’s methods, such as debug().
  • If the Logger object is obtained from a class, Logback looks up the matching <logger> in logback.xml using the class’s package (or a parent package) path. If the Logger object is obtained from a string, Logback uses that string to find a custom <logger> in logback.xml (see the sketch below).
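
A short sketch of the two lookup styles, reusing the com.taogen package and the my-custom-log logger name from the examples above (assuming slf4j-api and logback-classic are on the classpath):

package com.taogen.example;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggerLookupExample {
    // Resolved by package hierarchy: inherits from <logger name="com.taogen">, otherwise falls back to <root>
    private static final Logger classLogger = LoggerFactory.getLogger(LoggerLookupExample.class);

    // Resolved by exact name: matches <logger name="my-custom-log">
    private static final Logger customLogger = LoggerFactory.getLogger("my-custom-log");

    public static void main(String[] args) {
        classLogger.info("resolved through the class's package name");
        customLogger.info("resolved through the custom logger name");
    }
}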

More Configurations

Custom Loggers

You can create a custom logger by giving it a custom name instead of using a package path as its name.

<configuration>

    <logger name="my-custom-log" level="DEBUG" additivity="false">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE_CUSTOM"/>
    </logger>

</configuration>

Output log messages:

2024-03-07 09:20:43 [main] INFO  my-custom-log - Hello!
2024-03-07 09:21:57 [main] INFO  my-custom-log - Hello!

  • my-custom-log: the logger name.

Note that if you use a custom logger, you can’t get class information from the log message.

Configurations for Different Environments

Using <springProfile>

<configuration>
    ...
    <!-- springProfile: 1) name="dev | test". 2) name="!prod" -->
    <springProfile name="dev | test">
        <root level="INFO">
            <appender-ref ref="CONSOLE"/>
            <appender-ref ref="ROLLING_FILE"/>
        </root>
        <logger name="com.taogen.commons" level="DEBUG" additivity="false">
            <appender-ref ref="CONSOLE"/>
            <appender-ref ref="ROLLING_FILE"/>
        </logger>
    </springProfile>
    <springProfile name="prod">
        <root level="INFO">
            <appender-ref ref="CONSOLE"/>
            <appender-ref ref="ROLLING_FILE"/>
        </root>
    </springProfile>
</configuration>

Dynamically set the log configuration file path

You can dynamically set the log configuration file path in application.yml, so that different Spring Boot environments use different log configuration files.

application.yml

logging:
  config: classpath:logback-${spring.profiles.active}.xml

logback-dev.xml

<configuration>
    ...
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE"/>
    </root>
    <logger name="com.taogen.commons" level="DEBUG" additivity="false">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE"/>
    </logger>
</configuration>

logback-prod.xml

<configuration>
    ...
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE"/>
    </root>
</configuration>

A Complete Example

Goals

  • Properties
    • log patterns, log directory and log filename.
  • Appenders
    • Colorful log pattern for console appender.
    • Console and RollingFile appenders.
    • Time-based rolling policy: roll over daily and keep 30 days’ worth of logs.
    • Filter log messages in appenders; separate INFO and ERROR log messages.
  • Loggers
    • Set loggers for the root and for the project’s base package.
    • Using custom loggers.
    • Support multiple spring boot environments.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE configuration>
<configuration>
    <!-- Define properties. You can use these properties in appender configurations. -->
    <property name="console.log.pattern"
              value="%red(%d{yyyy-MM-dd HH:mm:ss}) %green([%thread]) %highlight(%-5level) %boldMagenta(%logger{36}:%line%n) - %msg%n"/>
    <property name="file.log.pattern" value="%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n"/>
    <property name="file.log.dir" value="./logs"/>
    <property name="file.log.filename" value="mylogs"/>

    <!-- Define the CONSOLE appender: 1) log pattern, 2) charset. -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>${console.log.pattern}</pattern>
            <charset>utf8</charset>
        </encoder>
    </appender>

    <!-- RollingFileAppender: adds the capability to perform log file rotation. You can define a rolling policy, specifying criteria such as time-based or size-based rollover. -->
    <appender name="ROLLING_FILE_INFO" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${file.log.dir}/${file.log.filename}-info.log</file>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>INFO</level>
        </filter>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file will roll over daily -->
            <fileNamePattern>${file.log.dir}/${file.log.filename}-info-%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep 30 days' worth of logs -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>${file.log.pattern}</pattern>
        </encoder>
    </appender>

    <appender name="ROLLING_FILE_ERROR" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${file.log.dir}/${file.log.filename}-error.log</file>
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <level>ERROR</level>
            <onMatch>ACCEPT</onMatch>
            <onMismatch>DENY</onMismatch>
        </filter>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file will roll over daily -->
            <fileNamePattern>${file.log.dir}/${file.log.filename}-error-%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep 30 days' worth of logs -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>${file.log.pattern}</pattern>
        </encoder>
    </appender>

    <appender name="ROLLING_FILE_CUSTOM" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${file.log.dir}/custom-logs.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file will roll over daily -->
            <fileNamePattern>${file.log.dir}/custom-logs-%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep 30 days' worth of logs -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>${file.log.pattern}</pattern>
        </encoder>
    </appender>

    <!-- Define the root logger: 1) set the default level for all loggers, 2) set which appenders to use. -->
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE_INFO"/>
        <appender-ref ref="ROLLING_FILE_ERROR"/>
    </root>
    <!-- Custom logger -->
    <logger name="my-custom-log" level="INFO" additivity="false">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="ROLLING_FILE_CUSTOM"/>
    </logger>
    <!-- springProfile: 1) name="dev | test". 2) name="!prod" -->
    <springProfile name="dev | test">
        <!-- Define loggers. Set the log level for specific packages. -->
        <logger name="com.taogen.commons" level="DEBUG" additivity="false">
            <appender-ref ref="CONSOLE"/>
            <appender-ref ref="ROLLING_FILE_INFO"/>
            <appender-ref ref="ROLLING_FILE_ERROR"/>
        </logger>
    </springProfile>
</configuration>

Principles

  • Oriented toward novices. Assume that most readers of the document are beginners; this way, you will write documentation that is more understandable, in-depth, detailed, and readable.

Structure

  • Overall logic: what, why, how, when, where.
  • Try to break the content into detailed sub-sections so readers can quickly locate what they want to see.

Details

  • The steps should be clear. Label steps 1, 2, and 3.
  • Try to add links to nouns wherever you can, e.g. the official website, explanations of specialized terms.
  • Mark code terms with code formatting, e.g. code.
  • Use tables as much as possible for structured information.
  • Try to use pictures wherever they can make the explanation clearer and more visual. Don’t mind the hassle; it is more visual and easier to read. For example: UML diagrams, flow charts.
  • Give a link to the reference content at the end.

Others

  • After writing, read it through at least once to promptly detect and fix statement errors, incoherence, inaccuracy, and unclear expression.